I have a working script script.sh that runs python /opt/facenet/src/train.py. But when I try to run it from cron, I get the following error: ImportError: No module named tensorflow.

I tried to figure out where the module is installed:

 import tensorflow
 tensorflow.__file__
 /home/user/anaconda3/lib/site-packages/tensorflow/__init__.pyc

Then I changed the library path and ran the script like this: /home/user/anaconda3/lib/site-packages /opt/facenet/src/train.py, and got the following error: /home/user/anaconda3/lib/site-packages: Permission denied. I tried to fix that with chmod, but it did not help either.

I was advised to run the script like this: /home/user/anaconda3/bin/python /opt/facenet/src/train.py, but now I get the error: ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory.

Any idea what could be wrong?

  • Permission denied? Try running it with sudo. - gil9red
  • I think you were correctly advised to run the script using the Python from the Anaconda environment: /home/user/anaconda3/bin/python /opt/facenet/src/train.py. Now you need to solve the libcublas.so.9.0 problem. Take a look at this question: stackoverflow.com/questions/48428415/… - Andrey
  • Are you using VirtualEnv? - MaxU
  • @Andrey thanks, that helped. - arti_lina

1 answer

I have seen similar problems when working with VirtualEnv. In this case, it is convenient to create a small file in which all the required environment variables are set, for example:

Suppose you created a Python VirtualEnv and named it ml (Machine Learning).

For use in scripts, you can create an environment file (let's call it $HOME/.ml_env):

 export PYTHONPATH=/path/to/my/own/python_libs
 export LD_LIBRARY_PATH=$HOME/anaconda3/lib:$LD_LIBRARY_PATH
 export PATH=$HOME/anaconda3/bin:$PATH:$HOME/bin
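The LD_LIBRARY_PATH line is the part that addresses the libcublas.so.9.0 error: a GPU build of TensorFlow for CUDA 9.0 has to find that shared library at import time, and cron does not inherit your interactive shell's environment. As a quick sanity check (a sketch only; the anaconda3 path comes from the question, and /usr/local/cuda-9.0 is just a common install location), you can verify the file actually exists:

 # Check that libcublas.so.9.0 is present in one of the directories we export.
 # Both paths are assumptions; adjust them to where CUDA is actually installed.
 ls -l "$HOME/anaconda3/lib/libcublas.so.9.0" /usr/local/cuda-9.0/lib64/libcublas.so.9.0 2>/dev/null

 # Libraries already known to the system loader; conda/CUDA directories are often
 # not listed here, which is why LD_LIBRARY_PATH has to be exported for cron jobs.
 ldconfig -p | grep libcublas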

Then add the following lines to your shell scripts:

 #!/bin/bash
 source $HOME/.ml_env
 conda activate ml
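Tying this back to the cron problem from the question, a complete wrapper could look like the sketch below. The script name, log file and schedule are assumptions (only /opt/facenet/src/train.py and the anaconda3 location come from the question), and the conda.sh line is what makes conda activate work in non-interactive shells (conda >= 4.4):

 #!/bin/bash
 # script.sh - hypothetical wrapper for running the training job from cron.
 # Paths are assumptions taken from the question; adjust to your installation.

 # Load the environment variables (PYTHONPATH, LD_LIBRARY_PATH, PATH) defined above.
 source "$HOME/.ml_env"

 # Make 'conda activate' available in a non-interactive shell; without this,
 # cron starts the script in a shell where conda is not initialized.
 source "$HOME/anaconda3/etc/profile.d/conda.sh"
 conda activate ml

 # Run the training script with the interpreter from the activated environment,
 # logging output to a file since cron has no terminal to show errors on.
 python /opt/facenet/src/train.py >> "$HOME/train.log" 2>&1

The crontab entry can then call the wrapper with an absolute path, since cron provides only a minimal environment (the schedule and script location here are examples):

 # Run the training job every day at 03:00.
 0 3 * * * /bin/bash /home/user/script.sh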