nd = serializer.loads(command.value)
File "/home/ubuntu/Download/spark-2.1.1/python/pyspark/serializers.py", line 454, in loads
return pickle.loads(obj)
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/distkeras/workers.py", line 13, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/distkeras/utils.py", line 5, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/__init__.py", line 3, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/activations.py", line 4, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/backend/__init__.py", line 73, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 1, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/tensorflow/__init__.py", line 24, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/tensorflow/python/__init__.py", line 54, in <module>
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/tensorflow/core/framework/graph_pb2.py", line 6, in <module>
from google.protobuf import descriptor as _descriptor
ImportError: No module named google.protobuf
1. Check that the library you need to import is actually installed correctly.
2. Check whether the required paths are set in PYTHONPATH, for example:
export PYTHONPATH=/home/ubuntu/Download/spark-2.1.1/python:/home/ubuntu/anaconda2/lib/python2.7/site-packages
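Step 1 can be checked programmatically before touching PYTHONPATH. A minimal sketch (written for Python 3; on the Python 2.7 interpreter shown in the traceback you would use `imp.find_module` instead) that tests whether a module can be located on the current sys.path without crashing the process:

```python
import importlib.util

def module_available(name):
    """Return True if `name` can be found on the current sys.path."""
    try:
        # find_spec locates the module without importing it;
        # a missing parent package (e.g. 'google') raises ImportError
        return importlib.util.find_spec(name) is not None
    except ImportError:
        return False

# In the failing environment this would print False for "google.protobuf":
print(module_available("google.protobuf"))
```

If this prints False inside a Spark worker but True in your local shell, the worker's PYTHONPATH is the problem, not the installation itself.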
There is another possibility: if you are running Python 2.7.11 or a similar version,
sudo pip install protobuf
is enough. But if you are in an Anaconda environment, you should use
sudo conda install protobuf
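The pip-versus-conda choice above can be sketched as a small helper. The check on `sys.prefix` is a heuristic assumption (Anaconda installs typically contain "anaconda" or "conda" somewhere in their prefix path), not an official API:

```python
import sys

def suggest_installer():
    """Suggest an install command for protobuf based on the
    active interpreter. Heuristic: treat the interpreter as an
    Anaconda one if its prefix path mentions conda (an assumption,
    not a guarantee)."""
    prefix = sys.prefix.lower()
    if "anaconda" in prefix or "conda" in prefix:
        return "conda install protobuf"
    return "pip install protobuf"

print(suggest_installer())
```

Whichever command it suggests, run it with the same interpreter the Spark workers use, otherwise the module still won't be found at runtime.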