Hello! I can’t solve the following problem.

This is the output:

changed: [localhost] => (item=vol-xxxxxxxxxxxxxxxx) => {
   "ansible_loop_var": "item",
   "changed": true,
   "invocation": {
       "module_args": {
           "aws_access_key": null,
           "aws_ca_bundle": null,
           "aws_config": null,
           "aws_secret_key": null,
           "debug_botocore_endpoint_logs": false,
           "description": null,
           "device_name": null,
           "ec2_url": null,
           "instance_id": null,
           "last_snapshot_min_age": 0,
           "profile": null,
           "region": null,
           "security_token": null,
           "snapshot_id": null,
           "snapshot_tags": {
               "MarkedForDeletion": true
           },
           "state": "present",
           "validate_certs": true,
           "volume_id": "vol-xxxxxxxxxxxxxxxx",
           "wait": true,
           "wait_timeout": 900
       }
   },
   "item": "vol-xxxxxxxxxxxxxxxx",
   "snapshot_id": "snap-xxxxxxxxxxxxxxxx",
   "snapshots": [
       {
           "description": "",
           "encrypted": false,
           "owner_id": "xxxxxxxxxxxxxxxxxxx",
           "progress": "",
           "response_metadata": {
               "http_headers": {
                   "cache-control": "no-cache, no-store",
                   "content-length": "674",
                   "content-type": "text/xml;charset=UTF-8",
                   "date": "Thu, 05 Jan 2023 14:17:16 GMT",
                   "server": "AmazonEC2",
                   "strict-transport-security": "max-age=31536000;
includeSubDomains",
                   "x-amzn-requestid": "xxxxxxxxxxxxxxxxxxxx"
               },
               "http_status_code": 200,
               "request_id": "xxxxxxxxxxxxxxxxxxxxxxxxx",
               "retry_attempts": 0
           },
           "snapshot_id": "snap-xxxxxxxxxxx",
           "start_time": "2023-01-05T14:17:16.785000+00:00",
           "state": "pending",
           "tags": {
               "MarkedForDeletion": "True"
           },
           "volume_id": "vol-xxxxxxxxxx",
           "volume_size": 400
       }
   ],
   "tags": {
       "MarkedForDeletion": "True"
   },
   "volume_id": "vol-xxxxxxxxxxxxx",
   "volume_size": 400
}

I see this error:

TASK [Debug] **********************************************************************************************************************************************************

task path: /etc/ansible/ec2/create_ec2_instance/playbook/test.yaml:19
fatal: [localhost]: FAILED! => {
   "msg": "The task includes an option with an undefined variable. The
error was: 'dict object' has no attribute 'snapshot_id'\n\nThe error
appears to be in '/etc/ans
ible/ec2/create_ec2_instance/playbook/test.yaml': line 19, column 7, but
may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe
offending line app
ears to be:\n\n\n    - name: \"Debug\"\n      ^ here\n"


}

PLAY RECAP ************************************************************************************************************************************************************

localhost                  : ok=1    changed=1    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0

Here is my playbook:

---
- name: "test"
  hosts: localhost
  gather_facts: no
  vars:
    aws_volume_id:
      - vol-xxxxxxxxxxxxxxxxx
      - vol-xxxxxxxxxxxxxxxxx
  tasks:
    - name: "test"
      amazon.aws.ec2_snapshot:
        volume_id: "{{ item }}"
        wait_timeout: 900
        snapshot_tags:
          MarkedForDeletion: true
      register: snapshotid
      with_items: "{{ aws_volume_id }}"

    - name: "Debug"
      debug:
        msg: "{{ item.snapshot_id }}"
      with_items: "{{ snapshotid }}"
