Running sudo over SSH automatically
While I don't administer Linux hosts as part of my profession, I maintain a decent-sized fleet of virtual machines for testing and development. It's fairly common that I need to run some command remotely on dozens of machines, either to change configuration or to gather troubleshooting data. This can often be automated with a shell script that loops over the hosts, and SSH key-based authentication lets me log in remotely as a user without needing a password. Eventually, though, I run into the problem that whatever I want to automate requires root privileges, which means running sudo. That's a problem because sudo wants to prompt for a password before it runs the program.
The obvious workaround is to just log in as root over SSH. There isn't anything intrinsically wrong with that, but I generally don't recommend allowing root logins over SSH. Another common answer is to feed the password to sudo over standard input, which is what I believe Ansible does. I don't really like that solution either, since it means I can't reliably feed other data over standard input without resorting to other clever tricks. You could also reconfigure sudo on each host to not require a password, but I don't recommend that either.
This leaves only one real option that I am aware of. The sudo program can be instructed to ask for the password through a helper program with the --askpass (or -A) command line argument. When you do this, it checks the SUDO_ASKPASS environment variable for a program to run that asks the user for a password. This helps us because sudo expects that program to write the password to its standard output. So the "askpass" program can just be a script that immediately prints the password.
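As a quick illustration, an askpass helper can be as small as the script below. The path /tmp/askpass_demo.sh and the password s3cret are placeholders for illustration, not part of the real setup described later:

```shell
# write a minimal askpass helper (path and password are placeholders)
cat > /tmp/askpass_demo.sh <<'EOF'
#!/bin/sh
echo 's3cret'
EOF
chmod 700 /tmp/askpass_demo.sh

# the helper just prints the password when invoked
/tmp/askpass_demo.sh
# prints: s3cret

# sudo would then be run like this (not executed here):
#   SUDO_ASKPASS=/tmp/askpass_demo.sh sudo --askpass whoami
```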
The answer is seemingly obvious: run sudo remotely with --askpass, and set the SUDO_ASKPASS environment variable to the path of a script that prints the password when invoked. The first problem with this is that I don't want a script containing my password sitting on the computer's disk. With a disk there is no assurance about where the data is persisted or what steps are needed to remove it when done. So instead, we can create the script in /dev/shm. This path on the filesystem is generally writable by any user, and all files there are backed only by the computer's memory.
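On a typical Linux system, /dev/shm is a tmpfs mount, which can be confirmed from /proc/mounts. A quick check (assuming a standard Linux host):

```shell
# /dev/shm should show up as tmpfs, meaning files there live in RAM,
# not on disk, and disappear on reboot
grep ' /dev/shm ' /proc/mounts
```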
The second issue is that you can't set SUDO_ASKPASS remotely when running a program over SSH; which environment variables a client may set is controlled by the SSH server's configuration (AcceptEnv), not the client. This is easy enough to work around: a second script acts as a shim that first sets the environment variable and then invokes sudo.
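A sketch of such a shim, assuming the password helper has already been written to /dev/shm (both filenames here are hypothetical placeholders, not the names the script below generates):

```shell
# write a shim that sets SUDO_ASKPASS and then hands off to sudo
cat > /dev/shm/sudohelper_demo_sudo.sh <<'EOF'
#!/bin/sh
export SUDO_ASKPASS=/dev/shm/sudohelper_demo_pw.sh
exec sudo --askpass "$@"
EOF
chmod 700 /dev/shm/sudohelper_demo_sudo.sh
```

Running the shim over SSH instead of sudo directly sidesteps the server's environment variable restrictions entirely.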
I put all of this together into a Python script built on asyncssh. The script reads the hostname, username, password, and command to run from a CSV file on standard input. This means I can just create a list of commands that need to be run on different servers and execute them all at once. It works like this:
$ cat testfile.csv
10.233.22.108, ericu, thepassword, whoami
10.233.22.109, ericu, thepassword, whoami
10.233.22.110, ericu, thepassword, whoami
10.233.22.113, ericu, thepassword, whoami
$ cat testfile.csv | python3 sshsudo_example.py
root
root
root
root
sshsudo_example.py:
import asyncio
import asyncssh
import random
import sys
import time
import binascii
import shlex
import posixpath


async def run_sudo_remotely(hostname, username, password, cmd_to_run, tmp_path='/dev/shm'):
    # create randomly named files for the helper
    rng = random.Random(str(time.time()) + hostname)
    rand_suffix = str(binascii.hexlify(rng.randbytes(12)), encoding='ascii')
    tmpfilename_pw = posixpath.join(tmp_path, "sudohelper_%s_pw.sh" % (rand_suffix))
    tmpfilename_sudo = posixpath.join(tmp_path, "sudohelper_%s_sudo.sh" % (rand_suffix))
    cmd_to_run = shlex.join(cmd_to_run)
    # passing known_hosts=None disables host key checking
    async with asyncssh.connect(hostname, username=username, password=password,
                                known_hosts=None) as conn:
        async with conn.start_sftp_client() as sftp:
            # create a script to supply the password
            async with sftp.open(tmpfilename_pw, 'w',
                                 attrs=asyncssh.SFTPAttrs(permissions=0o700)) as fout:
                await fout.write("#!/bin/sh\n")
                await fout.write("echo ")
                await fout.write(shlex.quote(password))
                await fout.write("\n")
            # create a script to run sudo
            async with sftp.open(tmpfilename_sudo, 'w',
                                 attrs=asyncssh.SFTPAttrs(permissions=0o700)) as fout:
                await fout.write("#!/bin/sh\n")
                await fout.write("export SUDO_ASKPASS=" + tmpfilename_pw + "\n")
                await fout.write("exec sudo --askpass $@ ")
                await fout.write(cmd_to_run)
                await fout.write("\n")
            try:
                # run the script that invokes sudo
                async with conn.create_process(tmpfilename_sudo,
                                               stdin=asyncssh.DEVNULL) as process:
                    stdout, stderr = await process.communicate()
                    exit_code = process.exit_status
            finally:
                # remove both scripts
                [await sftp.unlink(x) for x in (tmpfilename_pw, tmpfilename_sudo,)]
    return (exit_code, stdout, stderr)


async def run_remote_commands(remote_commands):
    remote_commands = [run_sudo_remotely(*rc) for rc in remote_commands]
    return (await asyncio.gather(*remote_commands))


remote_commands = []
for line in sys.stdin:
    line = line.strip()
    if len(line) == 0 or line[0] == '#':
        continue
    line = [x.strip() for x in line.split(',')]
    hostname = line[0]
    username = line[1]
    password = line[2]
    cmd = line[3:]
    remote_commands.append((hostname, username, password, cmd))

remote_results = asyncio.run(run_remote_commands(remote_commands))
for result in remote_results:
    exit_code, stdout, stderr = result
    sys.stdout.write(stdout)
    sys.stderr.write(stderr)
    if exit_code != 0:
        sys.exit(exit_code)