Run multiple commands concurrently in Linux with Python
I need to execute multiple commands at the same time from Python on Linux. I don't want to run them one by one. I tried to write code for this, but I can't figure out how to execute several commands simultaneously with Python. I have also read about Python multithreading, but I don't know how to use it.
How can I run multiple commands at the same time with Python?

This looks like a typical producer-consumer problem:
import threading
import os

commands = ['ping www.google.com', 'ping www.yahoo.com', 'ping www.hotmail.com']
# Note: on Linux, plain `ping` runs forever; use e.g. 'ping -c 4 host' so the commands terminate.
lock = threading.Lock()

def worker_func():
    while True:
        with lock:            # pop atomically so two threads never grab the same command
            if not commands:  # loop exits when the list becomes empty
                return
            com = commands.pop(0)
        print("Start executing command..")
        os.system(com)
        print("[OK] command '" + com + "' ran successfully.")

# Create 5 workers (consumers)
workers = [threading.Thread(target=worker_func, name='thread_' + str(i)) for i in range(5)]
for worker in workers:
    worker.start()  # start working
for worker in workers:
    worker.join()   # wait for all workers to finish
Here I create 5 worker threads. Each thread runs the function worker_func, which picks an element from the list and executes it. The function returns (the thread exits) when the list becomes empty.

Note: be aware of where Python multithreading should *not* be used. In this case the GIL (Global Interpreter Lock) does not hurt you, because worker_func spawns a subprocess and waits for it to finish; while a thread is waiting, the GIL is released to the other threads.

I suggest two solutions, but there are many more:
import threading
import os

def ping_url(cmd):
    os.system(cmd)

thread_list = []
commands = ['ping www.google.com', 'ping www.yahoo.com', 'ping www.hotmail.com']

for url in commands:
    # Instantiate the thread
    t = threading.Thread(target=ping_url, args=(url,))
    # Keep the thread in a list so that it remains accessible
    thread_list.append(t)

# Start the threads
for thread in thread_list:
    thread.start()

# join() blocks the calling thread until the thread whose join() method is called terminates.
# From http://docs.python.org/2/library/threading.html#thread-objects
for thread in thread_list:
    thread.join()

# Demonstrates that the main process waited for the threads to complete
print("Done")
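Since os.system releases the GIL while it waits for the child process, the threads above really do run in parallel. A minimal sketch that checks this, using `sleep` as a stand-in for the real commands:

```python
import os
import threading
import time

def run(cmd):
    os.system(cmd)  # the GIL is released while waiting on the child process

start = time.time()
# Four commands that each take ~0.5 s, run in parallel threads
threads = [threading.Thread(target=run, args=('sleep 0.5',)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

elapsed = time.time() - start
print("elapsed: %.2f s" % elapsed)  # roughly 0.5 s rather than 2 s: the waits overlapped
```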
Simple solution:
Append `&` to each command to run it in the background:

import os

commands = ['ping www.google.com &', 'ping www.yahoo.com &', 'ping www.hotmail.com &']
for com in commands:
    os.system(com)  # the commands now run in the background
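One drawback of plain `&` is that the Python script has no way to wait for the background jobs. If shell-level parallelism is all you need, a sketch that launches everything in a single shell and uses the shell's `wait` builtin (with `sleep` as a placeholder command):

```python
import os

commands = ['sleep 0.2', 'sleep 0.2', 'sleep 0.2']

# Join the commands with '&' and finish with 'wait', so this single
# os.system call returns only after every background job is done.
status = os.system(' & '.join(commands) + ' & wait')
print("all background jobs finished, exit status:", status)
```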
A thread + queue solution that lets you cap the maximum number of threads to spawn:

from queue import Queue, Empty
import threading
import os

def worker_func():
    while not stopped.is_set():
        try:
            # use get_nowait() to retrieve a queued item so the
            # thread does not block when the queue is empty
            com = q.get_nowait()
        except Empty:
            continue
        try:
            os.system(com)
        except Exception as e:
            print("[-] Error running command: %s" % str(e))
        finally:
            q.task_done()

commands = ['ping www.google.com', 'ping www.yahoo.com', 'ping www.hotmail.com']
thread_count = 4  # maximum parallel threads
stopped = threading.Event()
q = Queue()

print("-- Processing %s tasks in thread queue with %s thread limit" % (len(commands), thread_count))
for item in commands:
    q.put(item)

for i in range(thread_count):
    t = threading.Thread(target=worker_func)
    # t.daemon = True  # enable to run the threads as daemons
    t.start()

q.join()       # block until all tasks are done
stopped.set()  # tell the workers to exit
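On Python 3 the same bounded-worker pattern is available out of the box via concurrent.futures.ThreadPoolExecutor, which manages the queue and worker lifecycle for you. A sketch of the same idea, with `echo` placeholders so it terminates quickly:

```python
import os
from concurrent.futures import ThreadPoolExecutor

commands = ['echo google', 'echo yahoo', 'echo hotmail']

# max_workers plays the role of thread_count above
with ThreadPoolExecutor(max_workers=4) as pool:
    exit_codes = list(pool.map(os.system, commands))

print(exit_codes)  # one exit status per command
```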
My solution does not start any extra threads. I use subprocess.Popen to run the commands, store the Popen objects in a list in a first loop, and then wait for the subprocesses to finish in a second loop:
from subprocess import Popen

commands = ['ping www.google.com', 'ping www.yahoo.com', 'dir']

processes = []
for count, com in enumerate(commands, 1):
    print("Start executing command..")
    # Popen returns immediately; the command keeps running in the background
    processes.append(Popen(com, shell=True))
    print("[OK] command " + str(count) + " started successfully.")
print("Finished starting..")

for i, process in enumerate(processes):
    process.wait()
    print("Command #{} finished".format(i))
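If you also want each command's output, rather than letting it go to the terminal, you can pass stdout=PIPE and read it via communicate() after starting all the processes. A sketch with `echo` placeholder commands:

```python
from subprocess import Popen, PIPE

commands = ['echo one', 'echo two', 'echo three']

# Start everything first so the commands run concurrently...
processes = [Popen(com, shell=True, stdout=PIPE) for com in commands]

# ...then collect the output; communicate() waits for each child to exit.
outputs = [p.communicate()[0].decode().strip() for p in processes]
print(outputs)
```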
You need subprocess.Popen: os.system blocks until the command finishes, while Popen returns immediately.