Thanks for the reply, I ended up using

- name: Find files >= 1GB
  find:
    paths: /var/log
    patterns: "wasabi-intuit-main*.log"
    file_type: file
    size: 1g
    recurse: yes
  register: files_too_large

because the file is one level deeper than /var/log and the nested 
loops seemed too complicated.  I would really love it if find's paths could 
also take a regex, instead of the pattern matching only applying to file names.
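For what it's worth, find does have a use_regex option, though it applies to 
patterns (matched against file names), not to paths; combined with recurse it 
can still reach files in nested directories. A rough sketch, with the regex 
guessed from the naming scheme above, so adjust to taste:

```yaml
- name: Find logs >= 1GB, matching the file name with a regex
  find:
    paths: /var/log
    # With use_regex, patterns are Python regexes instead of shell globs
    patterns: '^wasabi-intuit-main.*\.log$'
    use_regex: yes
    file_type: file
    size: 1g
    # recurse descends into the dated subdirectories under /var/log
    recurse: yes
  register: files_too_large
```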




On Tuesday, November 15, 2016 at 9:06:59 AM UTC-8, Kai Stian Olstad wrote:
>
> On 15. nov. 2016 00:15, Felix Gao wrote: 
> > thanks for your explanation and I have made some progress but I am 
> further 
> > stopped by the next stage. 
> > My problem is that I have a dynamic directory that generates log files in 
> > certain directories with the following format 
> > 
> /var/log/app-YYYYMMDD-RANDOM_HASH/log_file_name-YYYYMMDDMMSS-[audit,console,access,].log
>  
>
> > and I am trying to find all the logs in that directory that are more 
> than 
> > 1GB then trim it. 
>
> Can't you just do this in one find? 
>
> - name: Find file >=1GB 
>   find: 
>     path: /var/log 
>     patterns: "wasabi-intuit-main*.log" 
>     file_type: file 
>     size: 1g 
>   register: files_too_large 
>
>
> > 
> > I have these in my tasks 
> > 
> >   tasks: 
> >       # using the shell command because we need * expansion; otherwise, if 
> >       # we know the exact directory, we can use the command module instead 
> >       - name: list log directory to find wasabi main directory name 
> >         find: 
> >             paths: [ "/var/log/" ] 
> >             patterns: "wasabi-intuit-main*" 
> >             file_type: directory 
> >         register: out_directories 
> >         ignore_errors: True 
> > 
> >       - name: list log files for wasabi intuit main 
> >         find: 
> >             paths: "{{item.path}}" 
> >             patterns: "wasabi-intuit-main*.log" 
> >             file_type: file 
> >         register: out_files 
> >         with_items: "{{ out_directories.files }}" 
> >         ignore_errors: True 
> > 
> > but it seems the returned out_files variable is a dict.  The key is 
> > another dict from the previous job and the value is a dict of find's 
> > return values with added properties like "changed", "examined", and 
> > "msg".  Now I am confused about how to iterate over that object so I 
> > can filter the result. 
> > 
> > I have tried "{{ out_files.values().files }}" and 
> > "{{ out_files.results.files }}", which do not seem to work 
>
> This is going to be somewhat complicated; I do recommend looking at 
> doing it with fewer loops. 
>
> out_files.results.0.files will contain all files in the first directory 
> from out_directories 
> out_files.results.1.files will contain all files in the second directory 
> from out_directories 
> ... 
> ... 
>
>
> If you still want to iterate on out_files, you will have to look at 
> with_subelements. 
>
> Probably something like this 
>    - debug: var=item.1.path 
>      with_subelements: 
>        - "{{ out_files.results }}" 
>        - files 
>
>
> -- 
> Kai Stian Olstad 
>
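
In case anyone else hits the same nested-results shape: the variable 
registered by a looped find is a list under .results, and each element 
carries its own .files list. One way to flatten it into a single list and 
then trim each file (a sketch only; out_files is the registered variable 
from the play above, and truncate is just one way to trim a log):

```yaml
- name: Flatten all found files into one list
  set_fact:
    # map pulls out each per-directory files list; sum(start=[]) joins them
    all_log_files: "{{ out_files.results | map(attribute='files') | list | sum(start=[]) }}"

- name: Truncate each oversized log
  command: "truncate --size 0 {{ item.path }}"
  with_items: "{{ all_log_files }}"
```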

-- 
You received this message because you are subscribed to the Google Groups 
"Ansible Project" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/ansible-project/15dfc3cd-6504-47d4-be32-41852649b309%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
