python,stdout,python-multithreading
According to the thread documentation, when the main thread exits, it is system-defined whether the other threads survive. On SGI IRIX, using the native thread implementation, they survive. On most other systems, they are killed without executing try ... finally clauses or executing object destructors. So it is most...
You can't pass multiple arguments to a function using Pool.map(). As a simple solution, you can pack them into a tuple. # -*- coding: utf-8 -*- from multiprocessing import Pool import os users = { 'a': '1', 'b': '2', 'c': '3' } def launcher(args): os.system("python implementation.py %s %s" % (args[0], args[1])) if...
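A minimal sketch of the tuple-packing idea. It uses the thread-backed multiprocessing.dummy.Pool (which shares Pool.map's single-argument limitation) so the example is self-contained; the launcher function here just formats its arguments rather than shelling out like the snippet above.

```python
from multiprocessing.dummy import Pool  # thread-backed Pool, same map() API

users = {'a': '1', 'b': '2', 'c': '3'}

def launcher(args):
    # Pool.map passes exactly one argument, so unpack the packed tuple here
    name, value = args
    return "%s=%s" % (name, value)

# dict.items() already yields (key, value) tuples, i.e. pre-packed arguments
with Pool(3) as pool:
    results = pool.map(launcher, sorted(users.items()))
print(results)
```

On Python 3.3+ you could instead use Pool.starmap, which unpacks the tuples for you.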
python-2.7,python-multithreading
You're calling get with block set to True, which means it will block until an item is actually available in the queue. In your code, once the work queue is exhausted, the next get will block indefinitely, since it is waiting for an additional work item that will never come...
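A small sketch of the non-blocking alternative: pass block=False (or a timeout) so that an exhausted queue raises queue.Empty instead of hanging the worker forever.

```python
import queue

q = queue.Queue()
for job in ("a", "b", "c"):
    q.put(job)

done = []
while True:
    try:
        # block=False raises queue.Empty immediately when nothing is left
        item = q.get(block=False)
    except queue.Empty:
        break
    done.append(item)
print(done)
```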
python-3.x,python-multithreading
Python has a Global Interpreter Lock (GIL), which may be your problem: Python cannot run threads in a genuinely parallel way. You may think about switching to another language, or staying with Python but using process-based parallelism to solve your task. Here is a nice presentation: Inside the Python GIL
python,multithreading,flask,python-multithreading
You'd be better off using Celery (celeryproject.org) for this kind of task. It is heavily used in production environments, so there's no worry about dead ends later in development. It has everything you need for managing background tasks, and much more. Here's how to integrate it with Flask: flask.pocoo.org/docs/patterns/celery
python,linux,audio,python-multithreading,alsa
I've succeeded in getting it to work with python-alsaaudio. There are no out-of-the-box methods for this purpose, but it's relatively easy to implement. import alsaaudio import random import struct BUFFER_SIZE = 1024 def noise_callback(): return [random.randint(-20000, 20000) for i in range(BUFFER_SIZE)] def openaudio(card_name, freq, buffer_size, callback): device = alsaaudio.PCM(card=card_name) device.setchannels(1)...
python,sleep,python-multithreading
My solution was to find a real-time OS. The Raspberry Pi actually has a real-time patch for the Raspbian OS. For more information: https://www.raspberrypi.org/forums/viewtopic.php?t=39951 It was a bit of a challenge to enable, but it is working and the error rate dropped dramatically. Thank you @holdenweb for the tip with...
python,multithreading,py2exe,pyinstaller,python-multithreading
I can't answer for py2exe, but for pyinstaller this is a known problem with a well-documented workaround (which has always worked for me). http://stackoverflow.com/a/27694505/541038 provides a good overview of the problem and the solution...
python,python-3.x,parameters,tkinter,python-multithreading
You should pass thread callback function arguments as a tuple or a list: thread = Thread(target=buttonDisable, args=(button,)) BTW, using after, you don't need a thread at all. import time from tkinter import Tk, Button, SUNKEN, RAISED def tFunc(button): button.config(state='disable',relief=SUNKEN) root.after(5000, lambda: button.config(state='active', relief=RAISED)) # Invoke the lambda function in 5000...
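A runnable sketch of the args-as-tuple point, with a plain function standing in for the tkinter callback (button_disable and the "button1" string are hypothetical stand-ins):

```python
import threading

calls = []

def button_disable(button):
    # stand-in for the real tkinter callback; records which widget it received
    calls.append(button)

# args must be a tuple (note the trailing comma) or a list,
# not the result of calling the function
t = threading.Thread(target=button_disable, args=("button1",))
t.start()
t.join()
print(calls)
```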
python,multithreading,python-multithreading
Question 1 I think the best way is the easiest: don't subclass; just create two different threads. Move the Database object into the first and the Serial one into the second. That way, you won't make mistakes in the implementation of your subclassed threads, and bug fixing will be quicker. Question 2...
python,parallel-processing,python-multithreading
Just use checkMouse() instead of getMouse() inside your animation loop. Simple as that, I think. while civiliansSaved < 11: for horse in horseList: if horse.leg.getX() > -187: horse.move( -1, 20 ) else: horse.move( 5, 28 ) for civilian in civiliansList: if civilian.getX() < 800: civilian.move( 20, 0 ) else: civilian.move(...
python,multithreading,python-2.7,python-multithreading
There is no direct way to do this with threading.Lock objects that I know of. They do have a locked attribute, but it shows up as True in all threads, not just the owning thread. It's possible with RLock, but you have to access an internal __owner attribute on...
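A sketch of the RLock approach, assuming CPython: the _is_owned() method is an internal, undocumented detail (it exists because Condition relies on it), so don't depend on it in portable code.

```python
import threading

rlock = threading.RLock()

rlock.acquire()
# _is_owned() is CPython-internal: True only in the thread that holds the lock
owned = rlock._is_owned()
rlock.release()
released = rlock._is_owned()

print(owned, released)
```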
python,python-2.7,wxpython,python-multithreading
The first one works because command events automatically propagate up the containment hierarchy (a.k.a the window parent/child connections) until there is a matching binding found or until it reaches a top-level parent window like a frame or dialog. See http://wiki.wxpython.org/self.Bind%20vs.%20self.button.Bind and also http://wxpython.org/OSCON2006/wxPython-intro-OSCON2006.pdf starting at slide 53 for more explanation....
multithreading,csv,python-3.x,python-3.4,python-multithreading
I figured out what I had done wrong: I didn't close the file, so it never wrote to disk. Here is the code I am using now to write my CSV file. fieldnames = ['ip', 'dns', 'pings'] #headings test_file = open('test2-p.csv','w', newline='') #open file csvwriter = csv.DictWriter(test_file, delimiter=',',...
python,multithreading,python-multithreading
For Python 2.7 you can try this: for i in threading.enumerate(): if i.name == "MainThread": print i.is_alive() The use of lower camel case in method names is deprecated, so you should use i.is_alive() instead of i.isAlive(). If you like one-liners, try this: is_main_thread_active = lambda : any((i.name...
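For Python 3, a sketch of the same check; note that since 3.4 threading.main_thread() avoids scanning enumerate() by name:

```python
import threading

# the enumerate()-based check from the answer above, as a one-liner
is_main_thread_active = any(
    t.name == "MainThread" and t.is_alive() for t in threading.enumerate()
)
print(is_main_thread_active)

# Python 3.4+ equivalent without relying on the default thread name
print(threading.main_thread().is_alive())
```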
sockets,python-2.7,wxpython,python-multithreading
It appears that you are attempting to update wxPython code from a thread. This action is unsupported / undefined in wxPython. You need to use thread-safe methods to update the wxPython UI, such as wx.CallAfter or wx.PostEvent. See the following wxPython wiki page for some examples: http://wiki.wxpython.org/LongRunningTasks Basically, you'll want...
python,beautifulsoup,out-of-memory,web-crawler,python-multithreading
Each thread you start has a thread stack, which is 8 MB by default on a Linux system (see ulimit -s), so the total amount of memory needed for your threads would be more than 20 gigabytes. You can use a pool of threads, for example 10 threads...
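A sketch of the thread-pool idea with concurrent.futures: ten workers reuse their stacks instead of allocating thousands of them. The fetch function and the example URLs are placeholders for the real download-and-parse work.

```python
from concurrent.futures import ThreadPoolExecutor

urls = ["http://example.com/%d" % i for i in range(100)]

def fetch(url):
    # placeholder for urllib2/BeautifulSoup work; just measures the URL here
    return len(url)

# 10 threads total, regardless of how many URLs are queued
with ThreadPoolExecutor(max_workers=10) as pool:
    sizes = list(pool.map(fetch, urls))
print(len(sizes))
```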
python,events,python-multithreading
Pass the Event object to the thread target function so that they are shared between main thread and the pool thread: def poll_files(....., trigger): .... trigger.set() # main thread trigger = threading.Event() poll = threading.Thread(target=poll_files, args=myargs + (trigger,)) ... trigger.wait() ...
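A self-contained version of the snippet above; the "/tmp" argument is a hypothetical stand-in for whatever poll_files watches:

```python
import threading

def poll_files(path, trigger):
    # ... watch `path` for changes ...
    trigger.set()  # signal the main thread

trigger = threading.Event()
poll = threading.Thread(target=poll_files, args=("/tmp", trigger))
poll.start()

# wait() returns True as soon as the worker calls set()
fired = trigger.wait(timeout=5)
poll.join()
print(fired)
```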
python,multithreading,python-multithreading
Actually, this is trickier than you might think. The problem is that there's no character in plain ASCII that you can use to update two separate lines of text simultaneously. \r only lets you erase one line. If you want to update two lines you will have to use a...
python,function,while-loop,python-multithreading
Okay, no answers so far, so I'll try. Your main problem seems to be that you don't know how to interchange data between threads. First, the problem with the price. The loop here: while True: oanda = oandapy.API(environment="practice", access_token="xxxxxx") response = oanda.get_prices(instruments="EUR_USD") prices = response.get("prices") asking_price = prices[0].get("ask") s =...
python,file,python-2.7,python-multithreading
The way I would do it is with the string methods split and strip, which can be very useful: split breaks a string into a list based on a separator, and strip removes the spaces around a string. I will use the in operator instead of find, as for the case...
You are calling the function instead of just giving it as target for your thread. t2 = Thread(target=printLo) t = Thread(target=printYo) ...
python,multithreading,subprocess,python-multithreading
Your calls to subprocess.Popen() implicitly specify the default value of bufsize, 0, which forces unbuffered I/O. Try adding a reasonable buffer size (4K, 16K, even 1M) and see if it makes any difference.
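A sketch of passing an explicit bufsize (the unbuffered default of 0 applies to Python 2; Python 3 defaults to buffered I/O). It assumes a POSIX system with `cat` available, used here as a trivial stand-in for the real child process:

```python
import subprocess

# an explicit buffer size enables buffered pipe I/O;
# Python 2's default bufsize=0 forced an unbuffered pipe
p = subprocess.Popen(["cat"], stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, bufsize=65536)
out, _ = p.communicate(b"some data")
print(out)
```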
Before exiting be sure to set the internal flag on the Event object: my_event_var.set() Also you can intercept KeyboardInterrupt to set the flag on ctrl-c try: xx except KeyboardInterrupt: my_event_var.set() sys.exit() ...
python,multithreading,python-multithreading,socketserver
You can kill lots of threads at the end of your program by defining them as daemon threads. To do this, set their daemon property to True. According to the documentation, this must be set before start() is called, otherwise RuntimeError is raised. Its initial value is inherited from the...
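A minimal sketch of the daemon flag; time.sleep stands in for a long-running handler that would otherwise keep the process alive:

```python
import threading
import time

worker = threading.Thread(target=time.sleep, args=(60,))
worker.daemon = True  # must be set before start(), else RuntimeError
worker.start()
print(worker.daemon)
# interpreter exit will NOT wait for this thread; it is killed abruptly,
# so don't rely on cleanup code running inside daemon threads
```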
python-3.x,python-multithreading
Put all the lines before your for loop in background.py. When it is imported it will start the thread running. Change the run method to do your infinite while loop. You may also want to set daemon=True when starting the thread so it will exit when the main program exits....
python,tkinter,python-multithreading
This is how I'd do it. Well no, I'd probably wrap it into a class and instantiate the class but then I'm really just stealing whole-cloth from the linked question import tkinter as tk import customfunc def run(): root = tk.Tk() s_var = tk.StringVar() tk.Label(root, textvariable = s_var).pack() def update_time():...
python,multiprocessing,python-multithreading
You can use multiprocessing.Manager to help with this. It allows you to create a list that can be shared between processes: from functools import partial import multiprocessing def readFile(shared_variable, filename): """ Add the parent folder to the database and process the file """ path_parts = os.path.split(filename) dirname = os.path.basename(path_parts[0]) if...
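A trimmed, runnable sketch of the Manager-backed shared list. It explicitly requests the fork start method (Linux-specific) so the example stays guard-free; the per-file work is replaced by a trivial transformation:

```python
from multiprocessing import get_context

def read_file(shared, filename):
    # stand-in for "add the parent folder to the database and process the file"
    shared.append(filename.upper())

ctx = get_context("fork")  # fork start method: Linux-only in this sketch
with ctx.Manager() as manager:
    shared = manager.list()  # proxy list, visible to all child processes
    procs = [ctx.Process(target=read_file, args=(shared, name))
             for name in ("a.txt", "b.txt")]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    # completion order is nondeterministic, so sort before comparing
    results = sorted(shared)
print(results)
```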
python,multithreading,queue,python-multithreading
Here is a working example (for python 2.7). The program has two modes of operation: with no arguments it runs the receive loop with arguments it sends a datagram Note how r.start() and r.terminate() are called outside of the while loop in client. Also, receive has a while True loop....
pyqt4,python-multithreading,qthread
While searching for answer, qtcentre helped with it. You need to have seperate class for signals class MySignal(QtCore.QObject): sig = QtCore.pyqtSignal(list) sigStr = QtCore.pyqtSignal(str) This signal is used to communicate between threads. to communicate from main thread to worker thread, create instance of qthread in init of class where Ui...
How about creating a list of 101 elements, indices 0 through 100? Then store the output from processing file x in list element x. When all processing is complete, write the data from list elements 0 through 100 to the file.
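A sketch of that slot-per-file idea with a thread pool; process_file is a placeholder for the real work, and each worker writes only its own slot, so no lock is needed:

```python
from concurrent.futures import ThreadPoolExecutor

NUM_FILES = 101
results = [None] * NUM_FILES  # slot x will hold the output for file x

def process_file(x):
    # placeholder for the real processing of file number x
    results[x] = "output-%d" % x

with ThreadPoolExecutor(max_workers=8) as pool:
    for x in range(NUM_FILES):
        pool.submit(process_file, x)
# the with-block waits for all workers, so the slots are complete here,
# in order, regardless of which thread finished first
print(results[0], results[100])
```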
python,python-requests,python-multithreading
If you only have one process, then you only get one slice of the CPU, and that slice is divided between your 256 threads. That's potentially a lot of context switching. Use more processes to get more slices (there is a good multiprocessing module), or use fewer threads. Your check_proxy implementation...
python,amazon-web-services,amazon-ec2,python-multithreading
You would want to use the multiprocess module only if you want the processes to share data in memory. That is something I would recommend ONLY if you absolutely must have shared memory due to performance considerations. Python multiprocess applications are non-trivial to write and debug. If you are...
python,pickle,python-multithreading,pyparsing,python-multiprocessing
OK, here is the solution inspired by rocksportrocker: Python multiprocessing pickling error The idea is to dill the object that can't be pickled while passing it back and forth between processes and then "undill" it after it has been passed: from multiprocessing import Pool import dill def submit_decoder_process(decoder_dill, input_line): decoder...
python,multithreading,logging,event-handling,python-multithreading
Well, after some playing around, this is the solution I came up with: def log_iterator(logger_names): ''' Get an iterator that returns LogRecord objects whenever a log message is made. logger_names can be: a string identifying which logger you want to iterate messages from, or a list identifying all the loggers...
python,multithreading,locking,python-multithreading
I've decided to use the API provided by PyRO. pyro_ns_object = Pyro.naming.NameServerStarter() pyro_ns = threading.Thread(name='Pyro Name Server', target=self.__Pyro_NameServer, args=(pyro_ns_object,)) pyro_ns.setDaemon(True) self.__logger.debug('start thread def self.__Pyro_NameServer') pyro_ns.start() while True: if pyro_ns_object.waitUntilStarted(timeout=0.1): break else: self.__logger.debug('waiting pyro-ns') def __Pyro_NameServer(self, pyro_ns_object): try: self.__logger.debug('def __Pyro_NameServer')...
python,linux,performance,python-multithreading,python-multiprocessing
OK. I have the answer to my own question. Yes, it's taken me over 3 months to get this far. It appears to be GIL thrashing in Python that is the reason for the massive 'system' CPU spikes and associated pauses. Here is a good explanation of where the thrashing...
python,python-multithreading,tcpserver
The best solution I found was to use SocketServer.ForkingMixIn instead of SocketServer.ThreadingMixIn. This way the daemon actually works, even though using processes instead of threads was not exactly what I wanted....
python,multithreading,concurrency,python-multithreading
The following code maintains a 'pool' of threads (aka children). Every second, dead threads are removed and new ones added to maintain 3 threads at all times. For larger projects, consider moving to the concurrent.futures module, specifically the ThreadPoolExecutor class. I also recommend the 3rd-party library gevent (flexible, fast,...
python,tkinter,tk,python-multithreading
Tkinter provides an after method that allows you to schedule a command to run after some number of milliseconds. In the after handler you can check your state, make appropriate modifications, and schedule another call to the same handler. This way you never block the running UI thread but...
python,python-multithreading,temporary-files
You are ignoring the zf and sf objects in MyThread.__init__(); *args is left untouched. Because your running thread is not adding additional references to the open file object, by the time the startWorker function finishes there are no more references left to the objects and they are deleted, taking the...
python,multithreading,python-multithreading,asyncore
First of all, I would suggest not using the old asyncore module but to look into more modern and more efficient solutions: gevent, or going along the asyncio module (Python 3.4), which has been backported somehow to Python 2. If you want to use asyncore, then you have to know:...
The direct cause of your problem is in self.proc = self.thr.start(): the start() method starts a thread and has no return value. So self.proc is set to None there and self.proc.communicate(f.read()) will cause an exception. More generally the use of threads in your snippet seems a bit overkill, subprocess.Popen() in...
python,shell,subprocess,python-multithreading
Another approach (rather than the other suggestion of putting shell processes in the background) is to use multithreading. The run method that you have would then do something like this: thread.start_new_thread(myFuncThatDoesZGrep, ()) To collect results, you can do something like this: class MyThread(threading.Thread): def run(self): self.finished = False #...
python,subprocess,stanford-nlp,python-multithreading
Add th.join() at the end otherwise you may kill the thread prematurely before it has processed all the output when the main thread exits: daemon threads do not survive the main thread (or remove th.setDaemon(True) instead of th.join()).
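A small illustration of why the join() matters; drain_output stands in for the thread that reads the subprocess's remaining output:

```python
import threading
import time

lines = []

def drain_output():
    time.sleep(0.1)  # simulate reading the last of the subprocess's output
    lines.append("last line")

th = threading.Thread(target=drain_output)
th.daemon = True  # mirrors th.setDaemon(True) in the question
th.start()
th.join()  # without this, the daemon thread dies with the main thread,
           # possibly before it has appended anything
print(lines)
```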
You guys are going to like this one. Use threading.Event, not multiprocessing.Event. Then when you press ^C the signal handler is called just like it should! source import threading import signal class Class1(object): _stop_event = threading.Event() @staticmethod def interrupt(): Class1._stop_event.set() def wait(self): print("Waiting for _stop_event") if Class1._stop_event.wait(5): print("_stop_event set.") else:...
python,multithreading,beautifulsoup,python-multithreading
The below is a naive implementation (which will very quickly get you blacklisted from nydailynews.com): def fetch_url(i, logfile): page = urllib2.urlopen("http://www.nydailynews.com/search-results/search-results-7.113?kw=&tfq=&afq=&page={}&sortOrder=Relevance&selecturl=site&q=the&sfq=&dtfq=seven_years".format(i)) soup = BeautifulSoup(page.read()) snippet = soup.find_all('h2') for h2 in snippet: for link in h2.find_all('a'): logfile.write("http://www.nydailynews.com" + link.get('href') + "\n") print...
python,multithreading,python-multithreading
Re (1), you can make as many locks as you require -- presumably they're needed to protect different shared mutable resources in a many-threads environment (personally, I think a programming environment with many threads sharing many mutable resources is where programmers go after passing away, if they've led really terrible lives)....
python-2.7,random,python-multithreading
Your basic algorithm would take a long time in any event: guessing a random number with a one-in-1000000 chance of each guess being correct. Your program makes things worse by using a list to store the previous guesses. The more failed guesses go into the list, the slower...
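A sketch of the fix for the list problem: a set gives O(1) membership tests, so the guess loop doesn't slow down as failed guesses accumulate. The range is shrunk to 1000 (and seeded) just to keep the example fast and repeatable.

```python
import random

random.seed(42)  # deterministic for the example only
target = random.randrange(1000)

guessed = set()  # O(1) `in` checks, unlike a list
tries = 0
while True:
    guess = random.randrange(1000)
    if guess in guessed:
        continue  # skip numbers we've already ruled out
    guessed.add(guess)
    tries += 1
    if guess == target:
        break
print(tries)
```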
python,django,multithreading,python-multithreading,django-cache
The above works. As a side note, using a persistent database cache seems to be preferred over LocMemCache. # run python manage.py createcachetable CACHES = { 'default': { 'BACKEND': 'django.core.cache.backends.db.DatabaseCache', 'LOCATION': 'my_cache_table' } } The root cause of the problem was retrieving data based on an access control list, which...
multithreading,sublimetext2,sublimetext,python-multithreading,sublime-text-plugin
According to the Sublime Text 3 porting guide, the only thread-safe method in Sublime Text 2 is set_timeout: In Sublime Text 2, only the set_timeout method was thread-safe. In Sublime Text 3, every API method is thread-safe. ...
python,asynchronous,python-multithreading,asyncore
Threads would be the approach I would use. Try this: thread = threading.Thread(target = sendtextfile.Send, args = (datasend,)) thread.start() ...
python,multithreading,python-multithreading
The easy way to do this is to use Python 3.2 or later, get the backport of the current threading module for 3.1/2.7/etc. from PyPI, or just copy the code for that method from, say, 3.4's source. As the docs for Condition.wait explain: The return value is True unless a...
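A minimal demonstration of that return value on Python 3.2+: wait() returns False when the timeout expires without a notify.

```python
import threading

cond = threading.Condition()
with cond:
    # nobody will notify, so this times out and returns False (Python 3.2+)
    notified = cond.wait(timeout=0.05)
print(notified)
```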
python,multiprocessing,python-multithreading
The error turned out to be a bad line in the input file, which broke the regular expression. For example, one of the values of the comma-separated list was: foobar=2, foo=Something here, is ,a really, poor value, bar=2 I was able to fix the problem by adding the following...
python,multithreading,sockets,queue,python-multithreading
You can write the client loop like this: print "client loop started" while True: m = q.get() print "waiting for start, got:", m if m == "start": while True: try: m = q.get(False) except: m = None print "waiting for stop, got:", m if m == "stop": break Based on...
python,multithreading,python-3.x,python-multithreading
You could use thread local storage (via the threading.local class) to store a separate file object for each thread. The say function could then look up the right file: local = threading.local() def say(*args): if not hasattr(local, "logfile"): local.logfile = open("logfile{}".format(threading.get_ident()), "a") print(*args, file=local.logfile) I'm using threading.get_ident to get a...
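A self-contained variant of that snippet, with io.StringIO standing in for the real open("logfile{}".format(ident), "a") calls so the per-thread buffers can be inspected (the logfiles dict exists only for that inspection):

```python
import io
import threading

local = threading.local()
logfiles = {}  # ident -> buffer, just so the example can check the result

def say(*args):
    if not hasattr(local, "logfile"):
        # first call in this thread: create its private "file"
        local.logfile = io.StringIO()
        logfiles[threading.get_ident()] = local.logfile
    print(*args, file=local.logfile)

def worker():
    say("hello from", threading.current_thread().name)

t = threading.Thread(target=worker)
t.start()
t.join()
say("hello from main")

# one buffer per thread that called say(): the worker and the main thread
print(len(logfiles))
```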