Multinomial HMM in Python with PySpark?
I've been trying to implement my HMM model in PySpark, but so far I haven't found any examples I could adapt to my code. Can anyone help me? Here is my .py code:
import numpy as np
from hmmlearn import hmm
states = ['DET','ADJ','NN','V']
n_states = len(states)
observations = ['the','a','green','big','old','might','book','plants','people','person','John','wash','washes','loves','reads','books']
n_observations = len(observations)
start_probability = np.array([0.5,0.1,0.3,0.1])
transition_probability = np.array([
[0, 0, 0, 0.5],
[0.3, 0.2, 0.1, 0.2],
[0.7, 0.7, 0.4, 0.2],
[0, 0.1, 0.5, 0.1]
]).T
emission_probability = np.array([
[0.7, 0, 0, 0],
[0.3, 0, 0, 0],
[0, 0.1, 0, 0],
[0, 0.4, 0, 0],
[0, 0.4, 0, 0],
[0, 0.1, 0, 0.2],
[0, 0, 0.3, 0],
[0, 0, 0.2, 0],
[0, 0, 0.2, 0],
[0, 0, 0.1, 0],
[0, 0, 0.1, 0],
[0, 0, 0.1, 0.3],
[0, 0, 0, 0.2],
[0, 0, 0, 0.1],
[0, 0, 0, 0.19],
[0, 0, 0, 0.01]
]).T
# note: in hmmlearn >= 0.3 this discrete-emission model is hmm.CategoricalHMM;
# there, MultinomialHMM means something different (counts per symbol)
model = hmm.MultinomialHMM(n_components=n_states, init_params="")
model.startprob_ = start_probability
model.transmat_ = transition_probability
model.emissionprob_ = emission_probability
# predict a sequence of hidden states based on visible states
bob_says = np.array([[10, 5, 11]]).T  # column of observation indices: John, might, wash
#model = model.fit(bob_says)
logprob, alice_hears = model.decode(bob_says, algorithm="viterbi")
print("Bob says:", ", ".join(map(lambda x: observations[x], bob_says.T[0])))
print("Alice hears:", ", ".join(map(lambda x: states[x], alice_hears)))
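For reference, the Viterbi search that `model.decode(..., algorithm="viterbi")` performs can be sketched in plain NumPy with the same parameters. This is an illustrative re-implementation, not hmmlearn's actual internals:

```python
import numpy as np

states = ['DET', 'ADJ', 'NN', 'V']
observations = ['the', 'a', 'green', 'big', 'old', 'might', 'book', 'plants',
                'people', 'person', 'John', 'wash', 'washes', 'loves', 'reads', 'books']

startprob = np.array([0.5, 0.1, 0.3, 0.1])
transmat = np.array([
    [0, 0, 0, 0.5],
    [0.3, 0.2, 0.1, 0.2],
    [0.7, 0.7, 0.4, 0.2],
    [0, 0.1, 0.5, 0.1],
]).T
emissionprob = np.array([
    [0.7, 0, 0, 0],
    [0.3, 0, 0, 0],
    [0, 0.1, 0, 0],
    [0, 0.4, 0, 0],
    [0, 0.4, 0, 0],
    [0, 0.1, 0, 0.2],
    [0, 0, 0.3, 0],
    [0, 0, 0.2, 0],
    [0, 0, 0.2, 0],
    [0, 0, 0.1, 0],
    [0, 0, 0.1, 0],
    [0, 0, 0.1, 0.3],
    [0, 0, 0, 0.2],
    [0, 0, 0, 0.1],
    [0, 0, 0, 0.19],
    [0, 0, 0, 0.01],
]).T

def viterbi(seq, startprob, transmat, emissionprob):
    """Most likely state path for one sequence of observation indices."""
    delta = startprob * emissionprob[:, seq[0]]  # best prob of paths ending in each state
    backptrs = []
    for obs in seq[1:]:
        scores = delta[:, None] * transmat       # scores[i, j]: end in state i, step to j
        backptrs.append(scores.argmax(axis=0))   # best predecessor for each state j
        delta = scores.max(axis=0) * emissionprob[:, obs]
    path = [int(delta.argmax())]                 # best final state, then walk backpointers
    for back in reversed(backptrs):
        path.append(int(back[path[-1]]))
    return path[::-1]

bob_says = [10, 5, 11]                           # John, might, wash
decoded = viterbi(bob_says, startprob, transmat, emissionprob)
print("Alice hears:", ", ".join(states[s] for s in decoded))
```

Because each sequence is decoded independently, a function like this is also easy to map over a whole collection of sequences.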
I've read the documentation for GMM learning, but I can't seem to get HMM learning to work.
Thanks in advance.

I don't see anything related to Spark or PySpark in your code. Perhaps you should remove those tags, or explain in more detail how you want to use Spark's features to solve your problem.