
Ansible on a Linux controller: using the s3 module to copy a file from a Windows host to an S3 bucket

Question Detail

I am running Ansible on an AWS EC2 Linux machine, which connects to another AWS EC2 Windows machine to copy a file to an S3 bucket.

My tasks/main.yml file looks like this:

# tasks file for postgres

- name: Simple PUT operation
  aws_s3:
    bucket: codepipeline-artefact-12344555-abc
    object: /test.txt
    src: "C:\\teststore-selenium\\test.txt"
    mode: put

My ansible.cfg file looks like this:

log_path = /var/log/ansible.log
ansible_python_interpreter = /usr/bin/python

and my hosts file looks like this:



I get the following error when running the playbook:

[WARNING]: log file at /var/log/ansible.log is not writeable and we cannot create it, aborting

[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Aug
 12 2021, 07:06:15) [GCC 8.4.1 20200928 (Red Hat 8.4.1-1)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings
can be disabled by setting deprecation_warnings=False in ansible.cfg.
ansible-playbook [core 2.11.7]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/ec2-user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
  ansible collection location = /home/ec2-user/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.6.8 (default, Aug 12 2021, 07:06:15) [GCC 8.4.1 20200928 (Red Hat 8.4.1-1)]
  jinja version = 2.10.1
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
redirecting (type: action) ansible.builtin.win_copy to ansible.windows.win_copy
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: role_postgres.yml ***********************************************************************************************************************
1 plays in role_postgres.yml

PLAY [all] ****************************************************************************************************************************************

TASK [postgres : Simple PUT operation] ************************************************************************************************************
task path: /etc/ansible/roles/postgres/tasks/main.yml:10
<10.x.x.5> ESTABLISH WINRM CONNECTION FOR USER: Administrator on PORT 5986 TO 10.x.x.5
EXEC (via pipeline wrapper)
EXEC (via pipeline wrapper)
Using module file /usr/local/lib/python3.6/site-packages/ansible_collections/amazon/aws/plugins/modules/aws_s3.py
Pipelining is enabled.
EXEC (via pipeline wrapper)
EXEC (via pipeline wrapper)
The full traceback is:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/ansible/plugins/action/__init__.py", line 1176, in _parse_returned_data
    filtered_output, warnings = _filter_non_json_lines(res.get('stdout', u''), objects_only=True)
  File "/usr/local/lib/python3.6/site-packages/ansible/module_utils/json_utils.py", line 57, in _filter_non_json_lines
    raise ValueError('No start of json char found')
ValueError: No start of json char found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/ansible/executor/task_executor.py", line 158, in run
    res = self._execute()
  File "/usr/local/lib/python3.6/site-packages/ansible/executor/task_executor.py", line 582, in _execute
    result = self._handler.run(task_vars=variables)
  File "/usr/local/lib/python3.6/site-packages/ansible_collections/amazon/aws/plugins/action/aws_s3.py", line 63, in run
    result = merge_hash(result, self._execute_module(module_args=new_module_args, task_vars=task_vars, wrap_async=wrap_async))
  File "/usr/local/lib/python3.6/site-packages/ansible/plugins/action/__init__.py", line 1116, in _execute_module
    data = self._parse_returned_data(res)
  File "/usr/local/lib/python3.6/site-packages/ansible/plugins/action/__init__.py", line 1200, in _parse_returned_data
    match = re.compile('%s: (?:No such file or directory|not found)' % self._used_interpreter.lstrip('!#'))
  File "/usr/lib64/python3.6/re.py", line 233, in compile
    return _compile(pattern, flags)
  File "/usr/lib64/python3.6/re.py", line 301, in _compile
    p = sre_compile.compile(pattern, flags)
  File "/usr/lib64/python3.6/sre_compile.py", line 562, in compile
    p = sre_parse.parse(p, flags)
  File "/usr/lib64/python3.6/sre_parse.py", line 855, in parse
    p = _parse_sub(source, pattern, flags & SRE_FLAG_VERBOSE, 0)
  File "/usr/lib64/python3.6/sre_parse.py", line 416, in _parse_sub
    not nested and not items))
  File "/usr/lib64/python3.6/sre_parse.py", line 502, in _parse
    code = _escape(source, this, state)
  File "/usr/lib64/python3.6/sre_parse.py", line 362, in _escape
    raise source.error("incomplete escape %s" % escape, len(escape))
sre_constants.error: incomplete escape \u at position 2
fatal: []: FAILED! => {
    "msg": "Unexpected failure during module execution.",
    "stdout": ""
}

I’m not sure what’s wrong; any assistance would be appreciated. Thanks in advance.

Question Answer

You’re getting complaints about the JSON formatting of a return value; that usually happens to me when the request is malformed. I see you have ‘test.txt’ in both the src and object fields of tasks/main.yml; check whether that is redundant.

Next, I would do a direct query against AWS for that object with these credentials and see what the response is.
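As a sketch of that direct check, something like the following (bucket and key are taken from the task in the question; whether the key should include the leading slash from `object: /test.txt` is an assumption worth testing both ways):

```shell
#!/bin/sh
# Hypothetical check: ask S3 directly whether the object exists and whether
# the credentials can see it. Bucket/key values come from the question's task;
# try the key both with and without the leading "/" from object: /test.txt.
BUCKET="codepipeline-artefact-12344555-abc"
KEY="test.txt"

if command -v aws >/dev/null 2>&1; then
    # head-object prints the object's metadata on success and a clear
    # 403/404 error otherwise, which separates credential problems from
    # wrong-key problems
    aws s3api head-object --bucket "$BUCKET" --key "$KEY"
else
    echo "aws CLI not installed; run this where the AWS credentials live"
fi
```

If this fails too, the problem is with the credentials or the key, not with Ansible.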
