
Running the CSDN crawling example throws RocksDBException: Failed to create a directory: C:\code\weibocrawler\crawl\crawldb: The system cannot find the path specified #117

Open
jack13163 opened this issue Nov 29, 2019 · 3 comments


@jack13163

jack13163 commented Nov 29, 2019

```
Exception in thread "main" org.rocksdb.RocksDBException: Failed to create a directory: C:\code\weibocrawler\crawl\crawldb: 系统找不到指定的路径 (The system cannot find the path specified)
at org.rocksdb.RocksDB.open(Native Method)
at org.rocksdb.RocksDB.open(RocksDB.java:231)
at cn.edu.hfut.dmic.webcollector.plugin.rocks.RocksDBUtils.open(RocksDBUtils.java:94)
at cn.edu.hfut.dmic.webcollector.plugin.rocks.RocksDBUtils.openCrawldbDatabase(RocksDBUtils.java:60)
at cn.edu.hfut.dmic.webcollector.plugin.rocks.RocksDBManager.inject(RocksDBManager.java:87)
at cn.edu.hfut.dmic.webcollector.crawldb.DBManager.inject(DBManager.java:66)
at cn.edu.hfut.dmic.webcollector.crawler.Crawler.inject(Crawler.java:73)
at cn.edu.hfut.dmic.webcollector.crawler.Crawler.start(Crawler.java:114)
at cn.edu.hfut.dmic.webcollector.crawler.AutoParseCrawler.start(AutoParseCrawler.java:62)
at cn.edu.hfut.dmic.webcollector.example.TutorialCrawler.main(TutorialCrawler.java:90)
```

The dependency has already been added:

```xml
<dependency>
    <groupId>org.rocksdb</groupId>
    <artifactId>rocksdbjni</artifactId>
    <version>5.17.2</version>
</dependency>
```
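
One possible reading of that error: the Windows message suggests a component of the crawl path did not exist when RocksDB tried to create the crawldb directory. A minimal, hypothetical workaround sketch (the path is taken from the stack trace, not from any fix confirmed in this thread; adjust it to whatever crawlPath the crawler is configured with) would be to pre-create the directories before starting the crawler:

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class PrepareCrawlDir {
    public static void main(String[] args) throws Exception {
        // Create the full crawl path, including any missing parent directories,
        // so RocksDB only has to open an existing directory.
        // Path copied from the stack trace; replace with your own crawlPath.
        Files.createDirectories(Paths.get("C:\\code\\weibocrawler\\crawl\\crawldb"));
        System.out.println("crawldb directory is ready");
    }
}
```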
@hujunxianligong
Member

hujunxianligong commented Nov 29, 2019 via email

@jack13163
Author

Thanks for your reply. I found the problem: it came from the package I imported.

I was previously using the rocks plugin, which had the problem above; the import was:
import cn.edu.hfut.dmic.webcollector.plugin.rocks.BreadthCrawler;
After checking an earlier working example, I found that switching to the berkeley plugin works fine:
import cn.edu.hfut.dmic.webcollector.plugin.berkeley.BreadthCrawler;
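
For reference, a minimal sketch of a crawler built on the berkeley plugin, along the lines of the WebCollector 2.x tutorial examples; the seed URL, regex, crawl path, and depth below are placeholders, not values from this thread, and method names follow the recent 2.x API:

```java
import cn.edu.hfut.dmic.webcollector.model.CrawlDatums;
import cn.edu.hfut.dmic.webcollector.model.Page;
import cn.edu.hfut.dmic.webcollector.plugin.berkeley.BreadthCrawler;

public class TutorialCrawler extends BreadthCrawler {

    public TutorialCrawler(String crawlPath, boolean autoParse) {
        super(crawlPath, autoParse);
        // Placeholder seed and link-following pattern for CSDN blog pages.
        addSeed("https://blog.csdn.net/");
        addRegex("https://blog\\.csdn\\.net/.*");
    }

    @Override
    public void visit(Page page, CrawlDatums next) {
        // Print each fetched URL; replace with real extraction logic.
        System.out.println(page.url());
    }

    public static void main(String[] args) throws Exception {
        // "crawl" is the local directory used to persist the crawl database.
        TutorialCrawler crawler = new TutorialCrawler("crawl", true);
        crawler.start(2);
    }
}
```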

@Edward1428

I'm getting this error too. For DemoSeleniumCrawler, how can I use BreadthCrawler? I need to fetch data after JavaScript rendering; any help would be appreciated.
