Python 3.x fcntl file locking example not working
I am trying to write a toy example of file locking in Python 3.4.3, but I am not getting the expected result. I have two scripts, script1.py and script2.py:
# script1.py
import pickle
import pandas as pd
import numpy as np
import time
import fcntl

df = pd.DataFrame.from_dict({"script_id": [0], "val1": [0], "val2": [0]})
df.to_pickle("data.pkl")

for i in range(500):
    f = open("data.pkl", "rb+")
    while True:
        try:
            # lock if unlocked
            fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
            break
        except BlockingIOError:
            time.sleep(0.01)
    df.loc[i, :] = np.concatenate([np.array([1]), np.random.sample(2)])
    time.sleep(np.random.uniform(0, 0.05))
    pickle.dump(df, f)
    # unlock when done
    fcntl.flock(f, fcntl.LOCK_UN)
    f.close()
The second script is very similar:
# script2.py
import pickle
import pandas as pd
import numpy as np
import time
import fcntl

f = open("data.pkl", "rb")
while True:
    try:
        fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
        df = pickle.load(f)
        fcntl.flock(f, fcntl.LOCK_UN)
        f.close()
        break
    except BlockingIOError:
        time.sleep(0.001)

for i in range(500, 1000):
    f = open("data.pkl", "rb+")
    while True:
        try:
            # lock if unlocked
            fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
            break
        except BlockingIOError:
            time.sleep(0.01)
    df.loc[i, :] = np.concatenate([np.array([2]), np.random.sample(2)])
    time.sleep(np.random.uniform(0, 0.05))
    pickle.dump(df, f)
    # unlock when done
    fcntl.flock(f, fcntl.LOCK_UN)
    f.close()
The idea is that the two scripts read and write the same file, with some artificial delays thrown in. Each script appends random rows to the DataFrame it loads from data.pkl, so in total the DataFrame should end up with 1000 rows. I run script1 first, then start script2 as quickly as possible. What I actually end up with is a DataFrame with 500 + n rows, where n is the number of rows that had been appended before script2 started.
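One likely cause of the lost rows (my reading, not stated in the original post): the locking itself works, but each script keeps mutating its own in-memory DataFrame and overwriting the file, so whichever script writes last clobbers the other's rows. A correct read-modify-write has to re-read the shared file after acquiring the lock. A minimal sketch of that pattern, using a hypothetical JSON counter file instead of the pickled DataFrame (the file name and function are mine, for illustration):

```python
import fcntl
import json

PATH = "counter.json"  # hypothetical shared file, for illustration only

def locked_increment():
    # The read-modify-write must happen entirely while the lock is held,
    # and the contents must be re-read *after* acquiring the lock;
    # otherwise another process's update is silently overwritten.
    with open(PATH, "r+") as f:
        fcntl.flock(f, fcntl.LOCK_EX)   # blocks until the lock is free
        data = json.load(f)             # re-read under the lock
        data["count"] += 1
        f.seek(0)
        json.dump(data, f)
        f.truncate()                    # drop leftover bytes from the old contents
        # the lock is released when the file is closed
```

The `f.seek(0)` / `f.truncate()` pair matters too: writing to an already-open "r+"/"rb+" handle without it leaves stale bytes from the previous, possibly longer, contents.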
Why am I using pandas and numpy in a toy example? Because I will have a similar real use case, and I want to make sure the approach works with the objects I will actually be using.

Here is the solution, in case anyone else runs into this problem. I created a second file, lock.lck (it has to exist beforehand, but can be empty), which is held locked while I do the work and write the data.pkl file. Once data.pkl has been written, the lock is released.
# script1.py
import pandas as pd
import numpy as np
import time
import fcntl

df = pd.DataFrame.from_dict({"script_id": [0], "val1": [0], "val2": [0]})
df.to_pickle("data.pkl")

for i in range(500):
    with open("lock.lck", "r") as f_lock:
        # hold an exclusive lock on the sentinel file while updating data.pkl
        fcntl.flock(f_lock, fcntl.LOCK_EX)
        time.sleep(np.random.uniform(0, 0.05))
        df = pd.read_pickle("data.pkl")
        df.loc[i, :] = np.concatenate([np.array([1]), np.random.sample(2)])
        df.to_pickle("data.pkl")
And now the second script:
# script2.py
import pandas as pd
import numpy as np
import time
import fcntl

for i in range(500, 1000):
    with open("lock.lck") as f_lock:
        # hold an exclusive lock on the sentinel file while updating data.pkl
        fcntl.flock(f_lock, fcntl.LOCK_EX)
        time.sleep(np.random.uniform(0, 0.05))
        df = pd.read_pickle("data.pkl")
        df.loc[i, :] = np.concatenate([np.array([2]), np.random.sample(2)])
        df.to_pickle("data.pkl")
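The lock-file pattern used by both scripts can be wrapped in a small helper so the flock bookkeeping lives in one place. A sketch (the `locked` helper is mine, not part of the original scripts):

```python
import fcntl
from contextlib import contextmanager

@contextmanager
def locked(lock_path="lock.lck"):
    # Open the sentinel file (creating it if needed, thanks to mode "a")
    # and hold an exclusive flock for the duration of the with-block.
    with open(lock_path, "a") as f_lock:
        fcntl.flock(f_lock, fcntl.LOCK_EX)
        try:
            yield
        finally:
            fcntl.flock(f_lock, fcntl.LOCK_UN)
```

With this, each loop body becomes `with locked(): df = pd.read_pickle("data.pkl"); ...; df.to_pickle("data.pkl")`, and pre-creating lock.lck is no longer necessary because mode "a" creates the file on first use.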