It sounds like you're just trying to merge the subprocess's stdout and stderr into a single pipe. To do that, as the docs explain, you just pass stderr=subprocess.STDOUT. If, on the other hand, you want to read from both pipes independently, without blocking on either one of them, then you...
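For reference, a minimal sketch of the merged-pipe approach, using a throwaway child that writes to both streams (the child command here is purely illustrative):

```python
import subprocess
import sys

# stderr=subprocess.STDOUT folds the child's stderr into its stdout,
# so both streams arrive on one pipe.
p = subprocess.Popen(
    [sys.executable, '-c',
     'import sys; print("to stdout"); print("to stderr", file=sys.stderr)'],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
    universal_newlines=True)
merged, _ = p.communicate()
# merged now holds both streams (interleaving order is not guaranteed).
```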
python,subprocess,kill,terminate
While I don't agree at all with your design, the specific problem is here: except Exception as e: terminated = True finally: p = subprocess.Popen(['python.exe', r'D:\test.py'], shell=True) In the case that an Exception was thrown, you're setting terminated to true, but then immediately restarting the subprocess. Then, later, you check:...
python,rest,subprocess,cherrypy
If your background task is simple and isn't CPU-bound I would suggest you use cherrypy.process.plugins.BackgroundTask. It's a thread-based solution. Here's an answer with a complete example. Generally in CherryPy we don't pass callbacks along to internal components. Instead we use Plugins. CherryPy's own components like session data expiration...
python,linux,bash,subprocess,popen
Using pexpect: 2.py: import sys if sys.stdout.isatty(): print('hello') else: print('goodbye') subprocess: import subprocess p = subprocess.Popen( ['python3.4', '2.py'], stdout=subprocess.PIPE ) print(p.stdout.read()) --output:-- goodbye pexpect: import pexpect child = pexpect.spawn('python3.4 2.py') child.expect(pexpect.EOF) print(child.before) #Print all the output before the expectation. --output:-- hello Here it is with grep --colour=auto: import subprocess p...
You can use your suggested approach. Either specify the python executable in the shebang (you need both python 2 and 3 installed in parallel): #! /usr/bin/env python2 and (in your python 3 calling script): #! /usr/bin/env python3 or you can specify the interpreter when you are calling the script: output...
You could wait till all of them finish their job, and then aggregate their standard outputs. To see how it's done, see this answer which covers implementation in-depth. If you need to do it asynchronously, you should spawn a new thread for this job, and do the waiting in that...
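A sketch of that wait-then-aggregate pattern, using throwaway child commands for illustration:

```python
import subprocess
import sys

# Start all children first so they run concurrently, then collect their
# standard outputs one by one; communicate() waits for each to finish.
cmds = [[sys.executable, '-c', 'print(%d)' % i] for i in range(3)]
procs = [subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
         for cmd in cmds]
outputs = [proc.communicate()[0] for proc in procs]
combined = ''.join(outputs)
```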
How about running a loop to check the status of the processes? Something like this: for proc in procs: if proc.poll() is not None: # it has terminated # check returncode and handle success / failure ...
To call a Python script from another one using subprocess module and to pass it some input and to get its output: #!/usr/bin/env python3 import os import sys from subprocess import check_output script_path = os.path.join(get_script_dir(), 'a.py') output = check_output([sys.executable, script_path], input='\n'.join(['query 1', 'query 2']), universal_newlines=True) where get_script_dir() function is defined...
python-2.7,subprocess,stdout,maya
I was just trying something similar couple of days back, connecting to Deadline via the command line submitter and getting # File "C:\Program Files\Autodesk\Maya2013\bin\python26.zip\subprocess.py", line 786, in _make_inheritable # WindowsError: [Error 6] The handle is invalid error in Maya 2013.5. One workaround found here which does fix this issue is...
python,sockets,subprocess,tor,proxy-server
The issue is in DownloadYP.py - You do not have the file - C:\\rrr\japan\limit.txt I would suggest creating a dummy file in the above directory with that name, and trying to run the script again. Also, on a side note - you are mixing the OS path separators from Unix and...

check_output() works as expected. Here's its simplified implementation in terms of Popen(): def check_output(cmd): process = Popen(cmd, stdout=PIPE) output = process.communicate()[0] if process.returncode != 0: raise CalledProcessError(process.returncode, cmd, output=output) return output grep returns 1 if it hasn't found anything i.e., you should expect the exception if Xcode is not running....
Since call needs to pass an array of command line arguments, you can split the command line yourself and call like this: subprocess.call([ "curl", "-v", "-H", "X-Auth-User: myaccount:me", "-H", "X-Auth-Key: secretpassword", "http://localhost:8080/auth/v1.0/" ], shell=False) ...
Use subprocess.check_call redirecting stdout to a file object: from subprocess import check_call, STDOUT, CalledProcessError with open("out.txt","w") as f: try: check_call(['ls', '-l'], stdout=f, stderr=STDOUT) except CalledProcessError as e: print(e.message) Whatever you want to do when the command returns a non-zero exit status should be handled in the except. If you want...
python,cmd,subprocess,stdout,stdin
The main problem is that trying to read subprocess.PIPE deadlocks when the program is still running but there is nothing to read from stdout. communicate() manually terminates the process to stop this. A solution would be to put the piece of code that reads stdout in another thread, and then...
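A minimal sketch of the reader-thread solution, with a throwaway child standing in for the real program:

```python
import subprocess
import sys
import threading

# Drain the child's stdout from a helper thread so the main thread never
# blocks on an empty pipe while the child is still running.
lines = []

def reader(pipe):
    for line in iter(pipe.readline, ''):
        lines.append(line)
    pipe.close()

p = subprocess.Popen([sys.executable, '-c', 'print("tick"); print("tock")'],
                     stdout=subprocess.PIPE, universal_newlines=True)
t = threading.Thread(target=reader, args=(p.stdout,))
t.start()
p.wait()   # the main thread is free to do other work here instead
t.join()
```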
python,python-2.7,subprocess,nmap
shlex.split returns a list; you should concatenate this list with a one-element list containing strLine when building the command-line arguments: formatCom = shlex.split(command) subprocess.check_output(formatCom + [strLine]) The error occurs because instead of subprocess.check_output([ 'nmap', '-sT', '8.8.8.8' ]) you are executing something like subprocess.check_output([ ['nmap', '-sT'], '8.8.8.8' ]) and subprocess expects...
python,linux,ssh,subprocess,python-2.6
The solution for this was to simply have the command set to the absolute path, for example, instead of mke2fs, I needed /sbin/mke2fs.
python,shell,subprocess,inter-process-communicat
Environment variables are copied from parent to child, they are not shared or copied in the other direction. All export does is make an environment variable in the child, so its children will see it. Simplest way is to echo in the child process (I'm assuming it is a shell...
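A sketch of the echo approach on a POSIX shell (the variable name is illustrative):

```python
import subprocess

# The child's environment cannot flow back to the parent, so have the child
# export the variable and echo it; the parent reads the value from stdout.
out = subprocess.check_output(
    ['sh', '-c', 'export MYVAR=hello; echo "$MYVAR"'],
    universal_newlines=True)
value = out.strip()
```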
python,progress-bar,subprocess,event-driven
To avoid polling subprocess' status, you could use SIGCHLD signal on Unix. To combine it with tkinter's event loop, you could use the self-pipe trick. It also workarounds the possible tkinter + signal issue without the need to wake the event loop periodically. #!/usr/bin/env python3 import logging import os import...
python,pdf,subprocess,reportlab
You are not looping over the lines in the data but over the characters. Ex: >>> data="""a ... b ... line 3""" >>> # this will print each character (as in your code) ... for line in data: print line ... a b l i n e 3 >>> >>>...
See the docs, communicate() returns the content you are looking for right when it's called. You get it when you replace the second line in your inp() method by stdout_value, stderr_value = self.proc.communicate(f.read()) Note: if you expect large amounts of data to be returned communicate() is not your best option:...
You can both pass a parameter and receive a result, but the terminology is not correct :) A process returns the result of its invocation: 0 (success) or non-zero (error condition). subprocess.Popen() is what you need. Pass input via STDIN and get output from STDOUT. The called process must drop their results...
You would want to change your code to the following: def start_mviewer_broker(broker_path, test_name): """ The function starts the broker server""" try: print("**** start_mviewer_broker ****") return subprocess.Popen('python ' + broker_path + ' ' + test_name) # Note changed line here except: print("**** start_mviewer_broker - EXCEPTION ****") return 0 def kill_process(p): """...
Using subprocess.Popen(): >>> import subprocess >>> p = subprocess.Popen(['/your/cpp/program'], stdin=subprocess.PIPE, stdout=subprocess.PIPE) >>> p.stdin.write('1\n') >>> p.stdout.readline() '1\n' >>> p.stdin.write('10\n') >>> p.stdout.readline() '10\n' >>> p.stdin.write('0\n') >>> p.stdout.readline() '' >>> p.wait() 0 ...
Remove the stdout=subprocess.PIPE, and it should work; check_output itself captures the output, so redirecting it using stdout=subprocess.PIPE will cause problems. If you don't care about the output at all, just use subprocess.check_call (and again, don't use stdout=subprocess.PIPE).
The failing call is trying to find an executable named 'which python' not running which with an argument of python as you likely intended. The list that you pass to call (unless shell=True is set) is the list of the command and all the arguments. Doing subprocess.call(['which', 'python']) will probably...
This solution is based on -c (compile) Python interpreter cmdline argument. It ignores using shell properties entirely. Popen may be created in similar way. subprocess.call was used to show that script is in fact executed, and python interpreter return code is changed by code executed. import subprocess executable_code = """...
python,image,shell,imagemagick,subprocess
It is not subprocess that is causing any issue; it is what you are passing to ImageMagick that is incorrect: tostring() does get passed to ImageMagick. If you actually wanted to replicate the Linux command you can pipe from one process to another: from subprocess import Popen,PIPE proc = Popen(['cat', 'image.jpg'],...
Here is an example how to capture stdout, stderr and exit code with subprocess: p = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE) stdout, stderr = p.communicate() logger.info("Executed notification script %s, exit code %d", args, p.returncode) logger.info("stdout: %s", stdout) logger.info("stderr: %s", stderr) if p.returncode != 0: raise RuntimeError("The script did not exit cleanly: {}".format(args))...
You would need to escape the backslashes and also leave a space between the two: ffmpeg = "C:\\ffmpeg\\bin\\ffmpeg.exe " args = " -i C:\\video.mp4 -r 1 -f image2 C:\\FRAMES\\frame-%03d.jpg" However, this is not the recommended way. You should have a list which you pass as args rather than a...
Try this: subprocess.call('./process "%s"' % path, shell=True) I guess the problem is the space in the file name. File names with spaces in them should be enclosed in quotes like this ./process "foo bar.txt" or escaped like this ./process foo\ bar.txt....
You can certainly use Python for shell-script type stuff - with the bonus that it will be relatively portable. Another option you could consider is "BASH" "(The Bourne Again SHell). That will do everything you can do with .BAT files (and much more). Search for BASH shell scripting. Whether Python...
subprocess.Popen requires a list of strings, something like [ffmpeg, arg1, ...]. This command fails on Linux: subprocess.Popen("ls -la").wait() while this one succeeds: subprocess.Popen(["ls", "-la"]).wait() ...
It looks like a "block-buffering mode"-related issue. Run the script using python -u or add sys.stdout.flush() after print "hello world". sys.stdout is fully buffered when redirected to a file by default. print "hello world" adds the string to the stdout buffer. The buffer is not flushed if you kill the...
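A sketch of the buffering difference: with `-u` the child's stdout is unbuffered, so the line reaches the pipe immediately even though the child is still alive (the sleeping child here is a stand-in for a killed process):

```python
import subprocess
import sys

# Without -u, "hello world" would sit in the child's block buffer until it
# exits or flushes; with -u it is written to the pipe right away.
p = subprocess.Popen(
    [sys.executable, '-u', '-c',
     'import time; print("hello world"); time.sleep(30)'],
    stdout=subprocess.PIPE, universal_newlines=True)
line = p.stdout.readline()   # arrives immediately despite the sleep
p.kill()
p.wait()
```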
Since you asked the question in Python, you can also pipe the result: p = subprocess.Popen("perl pydyn.pl %s | sort" % file, stdout=outfile, shell=True) but for this you'll have to use shell=True, which is not good practice. Here's one way without shell=True: p = subprocess.Popen(["perl", "pydyn.pl",...
python-2.7,powershell,subprocess
Using single quotes inside a single-quoted string breaks the string. Use double quotes outside and single quotes inside or vice versa to avoid that. This statement: powershell -command '& {. ./uploadImageToBigcommerce.ps1; Process-Image '765377' '.jpg' 'C:\Images' 'W:\product_images\import'}' should rather look like this: powershell -command "& {. ./uploadImageToBigcommerce.ps1; Process-Image '765377' '.jpg'...
Your first parameter to subprocess.call is incorrect. It should be a list, not a string. Compare: >>> subprocess.call('echo hello') Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/lib64/python2.7/subprocess.py", line 522, in call return Popen(*popenargs, **kwargs).wait() File "/usr/lib64/python2.7/subprocess.py", line 710, in __init__ errread, errwrite) File "/usr/lib64/python2.7/subprocess.py", line...
python,terminal,subprocess,tty
What you expect is a function that receives a command as input, and returns meaningful output by running the command. Since the command is arbitrary, the requirement for a tty is just one of many bad cases that may happen (others include running an infinite loop); your function should only concern itself with its running...
python,pipe,subprocess,named-pipes
To emulate the bash process substitution: #!/usr/bin/env python from subprocess import check_call check_call('someprogram <(someprocess) <(anotherprocess)', shell=True, executable='/bin/bash') In Python, you could use named pipes: #!/usr/bin/env python from subprocess import Popen with named_pipe() as path1, named_pipe() as path2: someprogram = Popen(['someprogram', path1, path2]) with open(path1, 'wb', 0) as pipe1: someprocess =...
python,linux,file,sqlite3,subprocess
Is there a faster alternative to os.walk? Yes. In fact, multiple. scandir (which will be in the stdlib in 3.5) is significantly faster than walk. The C function fts is significantly faster than scandir. I'm pretty sure there are wrappers on PyPI, although I don't know one off-hand to...
I'm suspecting the problem is in this line: server_time_output = find_server_time(subprocess.check_output(r'net time \\{0}'.format(server))) Append rstrip() to the variable, to strip off any whitespace at the end: server_time_output = find_server_time(subprocess.check_output(r'net time \\{0}'.format(server.rstrip()))) Although server[:-1] will also work, if the server variable is ever input with no newline character, it'll strip off...
My question is can I either ping everything independently and not sequentially Sure. There are a variety of solutions to that problem, including both the threading and multiprocessing modules. and/or can I set a time filter of some sort on the subprocess so that, if things aren't updated after...
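A sketch combining both ideas: a thread pool runs the commands concurrently, and communicate(timeout=...) (Python 3.3+) gives up on stragglers. The commands below are stand-ins for the actual ping calls:

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

def run_with_timeout(cmd, seconds):
    # Returns the command's stdout, or None if it ran past the deadline.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            universal_newlines=True)
    try:
        out, _ = proc.communicate(timeout=seconds)
        return out
    except subprocess.TimeoutExpired:
        proc.kill()
        proc.wait()
        return None

cmds = [[sys.executable, '-c', 'print("fast")'],
        [sys.executable, '-c', 'import time; time.sleep(60)']]
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda c: run_with_timeout(c, 2), cmds))
```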
python,redirect,subprocess,stdout,popen
If you can't change the output filenames e.g., by passing them as a parameter to the subprocess or by specifying the output directory as a parameter then try to run the subprocesses in a different directory: from subprocess import check_call check_call(args, cwd="subdir") Make sure that args use absolute paths so...
Use the argument "universal_newlines=True" when you call the function: output = subprocess.check_output(['wmic', 'PATH', 'Win32_videocontroller', 'GET', 'description'], universal_newlines=True) print(output , "\n") ...
Use https://docs.python.org/2/library/subprocess.html#subprocess.Popen.wait p11 = subprocess.Popen('gnome-terminal -e "perl /tmp/expect"', shell=False) p11.wait() ...
You should split your command string into a list of arguments: import subprocess subprocess.call(["wget", "-O", "/home/oracle/Downloads/puppet-repo.rpm", "https://yum.puppetlabs.com/puppetlabs-release-el-6.noarch.rpm"]) You could also use the shell option as an alternative: import subprocess subprocess.call("wget -O /home/oracle/Downloads/puppet-repo.rpm https://yum.puppetlabs.com/puppetlabs-release-el-6.noarch.rpm", shell=True) By the way, in Python you don't need to add semicolons at the end of a...
There are at least two issues: command1 has unescaped '\t' in it. '\t' is a tab in Python -- a single character. If you want two characters (the backslash and t instead) then escape the backslash: '\\t' or use a raw string literal r'\t' e.g.: program = r'C:\ti\ccsv6\eclipse\eclipsec.exe' If you...
dd does not output anything to stdout, so your result is correct. However, it does output to stderr. Pass in stderr=subprocess.STDOUT to get the stderr output: >>> o = subprocess.check_output( ['dd', 'if=/etc/resolv.conf', 'of=r'], stderr=subprocess.STDOUT) >>> print(o) b'0+1 records in\n0+1 records out\n110 bytes (110 B) copied, 0.00019216 s, 572 kB/s\n' ...
- shlex.split() syntax is different from the one used by cmd.exe (%COMSPEC%)
- use raw-string literals for Windows paths, i.e., use r'c:\Users' instead of 'c:\Users'
- you don't need shell=True here and you shouldn't use it with a list argument
- you don't need to split the command on Windows: a string is the
Experimenting with /bin/cat, I found some things that might help you. First of all, always use finish-output after writing to the process's input: (format (process-input *cat*) "Hello~%") (finish-output (process-input *cat*)) Otherwise, the input may not reach the subprocess until you close the input stream. If Minecraft requires input before it'll...
poll_obj.poll() is supposed to block until there is something to read in one of the registered file descriptors. The blocking that you want to prevent is a desired feature. If that is not what you want, don't use poll. Other commands probably either print something quickly (and poll does not...
I managed to fix this. Turns out the subprocess was respawning itself, creating something weird which prevented Python from keeping track of it. So I had to do this to fix it. However, this is not the most elegant solution, and it is rather dangerous. Be careful if you use this, because...
Your cmd variable should be cmd = ["ls","-al" ] This is made clear in the documentation On Unix, if args is a string, the string is interpreted as the name or path of the program to execute. However, this can only be done if not passing arguments to the program....
python,python-2.7,subprocess,python-multiprocessing
The Queue object from the Queue module isn't suitable for multiprocessing: it's intended only for communication between threads in the same process. Use the Queue object from the multiprocessing module instead. from multiprocessing import Process, Queue That should solve the immediate problem. Here are a couple of other notes: out_q.get()...
For a script that does operations related to its location, you will want to first get the current directory you are in using originalDir = os.path.dirname(full_path). Next you will want to use os.chdir('/home/xxx/xxxx/xxx/xx/') and then do a subprocess.Popen("python test.py", shell=True) to run the script. Then do an os.chdir(originalDir) to get...
You can use start to do this subprocess.Popen(["start", "perl.exe", "update.pl", arg], stdin=subprocess.PIPE, shell=True) Note that some programs like notepad.exe will open in a new window but it will not create a new cmd window. You can test this approach with the following line: subprocess.Popen(["start", "dir"], stdin=subprocess.PIPE, shell=True) ...
Conclusion: Pay attention to character encodings (there are three different character encodings here). Use Python 3 if you want portable Unicode support (pass arguments as Unicode, don't encode them) or make sure that the data can be represented using current character encodings from the environment (encode using sys.getfilesystemencoding() on Python...
OK, got it. You have to pass some arguments to nmap for successful execution, and 'shell' should be enabled. The following command works: subprocess.check_output("nmap -V", shell=True) ...
python,subprocess,phantomjs,mocha,mocha-phantomjs
check_output() captures subprocess' stdout. Use check_call() to avoid redirecting stdout: #!/usr/bin/env python from subprocess import check_call check_call(['mocha-phantomjs', 'static/js/tests/headless.html']) ...
python,unit-testing,python-3.x,subprocess,readline
The appropriate way to control an interactive child process from Python is to use the pexpect module. This module makes the child process believe that it is running in an interactive terminal session, and lets the parent process determine exactly which keystrokes are sent to the child process. Pexpect is...
python,linux,process,subprocess
Ctrl-C at your terminal typically sends SIGINT to all processes in the foreground process group. Both your parent and your child process are in this process group. For a more detailed explanation, see for example The TTY demystified or the more technical version by Kirk McKusick at Process Groups and...
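To see the process-group mechanics, a sketch that puts a child in its own session (and hence its own process group), so a Ctrl-C aimed at the terminal's foreground group would not reach it (POSIX only; start_new_session needs Python 3.2+):

```python
import os
import subprocess

# start_new_session=True runs the child in a new session, detaching it
# from the terminal's foreground process group.
child = subprocess.Popen(['sleep', '30'], start_new_session=True)
# Its process group now differs from ours, so a group-wide SIGINT skips it.
assert os.getpgid(child.pid) != os.getpgid(0)
child.terminate()
child.wait()
```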
python,windows,subprocess,virtualenv
PATH is not the first place where CreateProcess() used by Popen() on Windows looks for the executable. It may use pip.exe from the same directory as the parent python.exe process. The shell (cmd.exe) uses different rules. See Popen with conflicting executable/path. To avoid the dependency; use the explicit full path...
java,python,subprocess,popen,communicate
Without more information (like some sample Java code) it's hard to be sure, but I'll bet the problem is that the Java code is waiting for a complete line, and you haven't sent one. If so, the fix is simple: output = p.communicate(input='5 5 4 3 2 1\n'.encode())[0] As a...
Use a pipe and write the data directly to that pipe: test = subprocess.Popen( 'svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin', stdout=subprocess.PIPE, universal_newlines=True) job = test.stdout.read().replace("@[email protected]", "http://localhost/svn/WernerTest/TMS/branches/test1") jenkins = subprocess.Popen( 'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7', stdin=subprocess.PIPE,...
python,subprocess,popen,quicktime,osascript
Having quit app and Quicktime Player 7 as two list elements will transform the command subprocess.Popen executes into something like this: osascript -e 'quit app' 'Quicktime Player 7' osascript expects the parameter following -e to be "one line of a script" (see osascript's man-page). Splitting up the parameters causes osascript...
Fix the quoting mechanism: sed -i 's/\"//g' file Should be just: sed -i 's/"//g' file You can also take adventage of shlex library. Example from interpreter: >>> import shlex >>> cmd = "sed -i '1d' file" >>> shlex.split(cmd) ['sed', '-i', '1d', 'file'] >>> cmd = """sed -i 's/"//g' file""" >>>...
python,bash,subprocess,exit-code,ln
You need to use os.path.expanduser: On Unix and Windows, return the argument with an initial component of ~ or ~user replaced by that user‘s home directory. import os os.path.expanduser('~/other_folder/src/models/sc_models.py') In [2]: os.path.expanduser("~") Out[2]: '/home/padraic' Python is looking for a directory named ~ in your cwd which obviously fails. When you...
You can use os.path.expanduser(path) if you want the option of using ~. On Unix and Windows, return the argument with an initial component of ~ or ~user replaced by that user‘s home directory. You should also make sure you're separating each part of the command. A common practice is to...
If you want to get rid of the shell=True you have to give the full path to the executable. import subprocess proc = subprocess.Popen('/full/path/start %s' % filename) proc.kill() ...
python-2.7,subprocess,temporary-files
Unless stdout=PIPE is set, p[0] will always be None in your code. To get the output of a command as a string, you could use check_output(): #!/usr/bin/env python from subprocess import check_output result = check_output("date") check_output() uses stdout=PIPE and .communicate() internally. To read output from a file, you should call .read() on the...
python,osx,python-2.7,subprocess
That's because MacOS X desktop applications are actually directories. The executable is buried inside. This works: subprocess.Popen(['/Applications/Calculator.app/Contents/MacOS/Calculator']) ...
To implement sh's &, avoid cargo cult programming and use subprocess module directly: import subprocess etcd = subprocess.Popen('etcd') # continue immediately next_cmd_returncode = subprocess.call('next_cmd') # wait for it # ... run more python here ... etcd.terminate() etcd.wait() This ignores exception handling and your talk about "daemon mode" (if you want...
python,python-3.x,streaming,subprocess,wsgi
If none of the child processes tries to read from stdin then the only reason for the deadlock that I see in your code is that .write(chunk), .read(chunk_size) may go out of sync if convert_process does not return byte for byte (if .flush() after .write(chunk) does not help). To emulate...
This is specifically for running the Python script as a commandline process, but I eventually got this working by combining two answers that people suggested. Using DETACHED_PROCESS as suggested in this answer worked for running it through IDLE, but not through the commandline interface. But using shell=True (as ajsp suggested)...
The bug is actually in your C++ program; it re-uses the in variable for both of its prints, which means if the second call to getline doesn't return anything (and it doesn't in your case, because the EOF of stdin is reached after the first getline call), the contents returned...
windows,python-3.x,subprocess,clipboard
As suggested by @eryksun, this solves the issue: p = subprocess.Popen('clip.exe', stdin=subprocess.PIPE, stdout=subprocess.PIPE, universal_newlines=True) p.communicate('hello \n world') p.wait() ...
Don't PIPE; just call check_output, passing a list of args, and remove shell=True: out = subprocess.check_output(["curl", "-k","--data", etree.tostring(tree)+"@SampleRequest.xml", "-v", "https://world-service-dev.intra.aexp.com:4414/worldservice/CLIC/CaseManagementService/V1"]) If you get a non-zero exit code you will get a CalledProcessError....
python,subprocess,converter,libreoffice
This is the code you should use: subprocess.call(['soffice', '--headless', '--convert-to', 'txt:Text', 'document_to_convert.doc']) This is the same line you posted, without the quotes around txt:Text. Why are you seeing the error? Simply put: because soffice does not accept txt:"Text". It only accepts txt:Text. Why is it working on the shell? Your...
python,windows,subprocess,stderr
As discussed in chat, the numbers are coming from stderr. By printing the ascii-indexes of each character in line, we discovered that the final line returned by readline() is \t75,881,728 \r175,882,240 \r\n. It looks like the \r embedded in the middle of this string (which DD outputs) is confusing your...
This is basically because of the buffering normally done by whatever program your cmd runs. You have to disable that program's default buffering in order to attain what you are looking for. In case it is a Python file you are running through the cmd...
python,timeout,subprocess,popen
You can use the timeout or waitmax commands to set a time limit on the process you are running with Popen. For instance, to run a tail -f command for a maximum of 10 seconds - import subprocess process=subprocess.Popen(['timeout' ,'10', 'tail', '-f', '/var/log/syslog'], stdout=subprocess.PIPE) out,err = process.communicate() print out Apr...
python,django,multithreading,sockets,subprocess
After reading the documentation of subprocess library I found the flag: close_fds. According to documentation: If close_fds is true, all file descriptors except 0, 1 and 2 will be closed before the child process is executed. (Unix only). Or, on Windows, if close_fds is true then no handles will be...
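A minimal sketch of passing the flag (on POSIX, close_fds=True is already the default from Python 3.2 on, but being explicit documents the intent):

```python
import subprocess
import sys

# With close_fds=True the child inherits only fds 0, 1 and 2, so any
# listening sockets the parent holds open are not kept alive by the child.
p = subprocess.Popen([sys.executable, '-c', 'print("ok")'],
                     stdout=subprocess.PIPE, universal_newlines=True,
                     close_fds=True)
out, _ = p.communicate()
```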
python,subprocess,stanford-nlp,python-multithreading
Add th.join() at the end; otherwise you may kill the thread prematurely, before it has processed all the output, when the main thread exits: daemon threads do not survive the main thread. (Or remove th.setDaemon(True) instead of adding th.join().)
python,subprocess,file-descriptor,pty
If the program does not generate much output; the simplest way is to use pexpect.run() to get its output via pty: import pexpect # $ pip install pexpect output, status = pexpect.run('top', timeout=2, withexitstatus=1) You could detect whether the output is "settled down" by comparing it with the previous output:...
python,cron,raspberry-pi,subprocess
subprocess.call() does not run the shell (/bin/sh) by default. You either need to emulate multiple pipes in Python or just pass shell=True: #!/usr/bin/env python from subprocess import check_call check_call("(crontab -l; echo '* * * * * ls -l | tee tresults.txt') | " "sort - | uniq - | crontab...
I think this would solve your issue: Python specify popen working directory via argument. I suppose the "./redist/PyMySQL/" directory could be used as the parameter because it is where setup.py is located. Try this: subprocess.Popen("py3 setup.py", cwd='/redist/PyMySQL/') On my end this works: subprocess.Popen(['py3','setup.py'], cwd='path-of-my-setup') ...
My syntax was wrong. This Q&A gets it straight. Subprocess Popen and PIPE in Python So the command is: Popen(['curl', '-s', 'http://download.finance.yahoo.com/d/quotes.csv?s=vwrl.as&f=l1'], stdout=PIPE).communicate()[0] ...
Globbing (expanding the *) is a function of your shell. You need to add the shell=True parameter to execute the command through a shell interpreter. subprocess.call("ls output*", shell=True) ...
python,exception-handling,subprocess
Starting from Python 3.2 Popen is a context manager. from the docs: Popen objects are supported as context managers via the with statement: on exit, standard file descriptors are closed, and the process is waited for. This should do pretty much what you want. This is the relevant part from...
You are doing this backwards, and shouldn't be using the child process to kill the parent process. Instead, you will want a parent process of your "perpetually running" script (which will now be the subprocess). When an update is detected, the subprocess kills itself, and requests that the parent implement...
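A toy sketch of that parent/supervisor shape — the inline child here fakes an "update pending" exit code on its first run, which is purely illustrative:

```python
import subprocess
import sys

# The parent keeps restarting the worker until it exits 0; a non-zero exit
# stands in for "I killed myself because an update was detected".
attempts = 0
while True:
    attempts += 1
    code = subprocess.call(
        [sys.executable, '-c',
         'import sys; sys.exit(0 if %d >= 2 else 1)' % attempts])
    if code == 0:
        break
```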
For 2>/dev/null, the appropriate way to control redirection of file descriptor 2 with the subprocess.Popen family of calls is stderr=: # Python 2.x, or 3.0-3.2 output = subprocess.check_output(['du', '-g', '-d1', '/Users'], stderr=open('/dev/null', 'w')) ...or, with a Python supporting subprocess.DEVNULL: # Python 3.3 or newer output = subprocess.check_output(['du', '-g', '-d1', '/Users'],...