I tried using sys.path.append() together with os.getcwd(), but it didn't work.
The source code comes from here; I downloaded and extracted it: [email protected]:~/test$ wget https://github.com/alvations/DLTK/archive/master.zip
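For context, a minimal sketch of that kind of attempt (the exact code tried is not shown here; this assumes the interpreter is started from ~/test):

```python
import os
import sys

# Append the current working directory to the module search path.
sys.path.append(os.getcwd())

# This alone doesn't help in this layout: ~/test contains DLTK-master/,
# not the dltk/ package itself, so "import dltk" would still fail.
print(os.getcwd() in sys.path)
```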
[email protected]:~/test$ unzip master.zip
[email protected]:~/test$ cd DLTK-master/; ls
dltk
[email protected]:~/test/DLTK-master$ cd dltk/; ls
tokenize
[email protected]:~/test/DLTK-master/dltk$ cd tokenize/; ls
abbrev.lex jwordsplitter-3.4.jar rbtokenize.pl
banana-split-standalone-0.4.0.jar koehn_senttokenize.pl splicer.py
igerman98_all.xml koehn_wordtokenize.pl tokenizer.py
__init__.py nonbreaking_prefix.de
[email protected]:~/test/DLTK-master/dltk/tokenize$ cat __init__.py
from tokenizer import punct_tokenize, rb_tokenize
from tokenizer import koehn_tokenize, deupunkt_tokenize
from splicer import jwordsplitter, jwordsplitteralvas
These are the functions I want to access from the ~/test/ directory, e.g. the koehn_tokenize function. But I can't seem to make the module/functions available in the Python interpreter.
From a Python interpreter started in the ~/test/ directory, how can I access the dltk.tokenize module?
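One way to avoid changing directories (a sketch, not taken from the original setup) is to point the PYTHONPATH environment variable at the directory that contains the dltk/ package when launching Python. The snippet below builds a throwaway stand-in package in a temp directory so it runs anywhere; for the real checkout the value would be ~/test/DLTK-master.

```shell
# Create a stand-in dltk package mirroring the layout above.
demo=$(mktemp -d)
mkdir -p "$demo/dltk/tokenize"
touch "$demo/dltk/__init__.py"
printf 'def koehn_tokenize(s):\n    return s.split()\n' > "$demo/dltk/tokenize/__init__.py"

# Point PYTHONPATH at the directory that CONTAINS dltk/ -- for the real
# checkout this would be: PYTHONPATH=~/test/DLTK-master python
PYTHONPATH="$demo" python3 -c "from dltk.tokenize import koehn_tokenize; print(koehn_tokenize('ein Test'))"
```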
The functions work if I cd into ~/test/DLTK-master/dltk/tokenize: [email protected]:~/test$ cd DLTK-master/dltk/tokenize/
[email protected]:~/test/DLTK-master/dltk/tokenize$ python
Python 2.7.5+ (default, Sep 19 2013, 13:48:49)
[GCC 4.8.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from __init__ import koehn_tokenize
>>>
But I don't want to cd into ~/test/DLTK-master/dltk/tokenize every time before starting the Python interpreter. I need to add the module/functions from within Python.
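What I'm after can be sketched like this: add the directory that contains the dltk package (DLTK-master, not tokenize/) to sys.path before importing. The snippet below builds a throwaway stand-in package in a temp directory so it is self-contained; for the real checkout the inserted path would be os.path.expanduser('~/test/DLTK-master').

```python
import os
import sys
import tempfile

# Build a throwaway package that mirrors the layout above:
#   <root>/dltk/__init__.py
#   <root>/dltk/tokenize/__init__.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "dltk", "tokenize"))
open(os.path.join(root, "dltk", "__init__.py"), "w").close()
with open(os.path.join(root, "dltk", "tokenize", "__init__.py"), "w") as f:
    f.write("def koehn_tokenize(s):\n    return s.split()\n")

# The fix: put the directory CONTAINING dltk/ on sys.path.  For the real
# checkout: sys.path.insert(0, os.path.expanduser('~/test/DLTK-master'))
sys.path.insert(0, root)

from dltk.tokenize import koehn_tokenize
print(koehn_tokenize("ein kleiner Test"))  # -> ['ein', 'kleiner', 'Test']
```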