DotnetSpider


DotnetSpider is a .NET Standard web crawling library: a lightweight, efficient, and fast high-level web crawling & scraping framework.

DESIGN

DESIGN IMAGE

DEVELOPMENT ENVIRONMENT

  1. Visual Studio 2017 (15.3 or later) or JetBrains Rider

  2. .NET Core 2.2 or later

  3. Docker

  4. MySql

     docker run --name mysql -d -p 3306:3306 --restart always -e MYSQL_ROOT_PASSWORD=1qazZAQ! mysql:5.7
    
  5. Redis (optional)

     docker run --name redis -d -p 6379:6379 --restart always redis
    
  6. SqlServer

     docker run --name sqlserver -d -p 1433:1433 --restart always  -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=1qazZAQ!' mcr.microsoft.com/mssql/server:2017-latest
    
  7. PostgreSQL (optional)

     docker run --name postgres -d  -p 5432:5432 --restart always -e POSTGRES_PASSWORD=1qazZAQ! postgres
    
  8. MongoDb (optional)

     docker run --name mongo -d -p 27017:27017 --restart always mongo
    
  9. Kafka

    docker run -d --restart always --name kafka-dev -p 2181:2181 -p 3030:3030 -p 8081-8083:8081-8083 \
           -p 9581-9585:9581-9585 -p 9092:9092 -e ADV_HOST=192.168.1.157 \
           landoop/fast-data-dev:latest
    
  10. Docker remote api for mac

    docker run -d  --restart always --name socat -v /var/run/docker.sock:/var/run/docker.sock -p 2376:2375 bobrik/socat TCP4-LISTEN:2375,fork,reuseaddr UNIX-CONNECT:/var/run/docker.sock
    
  11. HBase

    docker run -d --restart always --name hbase -p 20550:8080 -p 8085:8085 -p 9090:9090 -p 9095:9095 -p 16010:16010 dajobe/hbase                           
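
The MySQL password used above (1qazZAQ!) must match whatever connection string GetDefaultStorage() reads from configuration. A rough sketch of such an appsettings.json fragment is shown below; the key name is an assumption, so check the wiki for the authoritative configuration keys:

```json
{
  "ConnectionString": "Database='mysql';Data Source=localhost;User ID=root;Password=1qazZAQ!;Port=3306;"
}
```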
    

MORE DOCUMENTS

https://github.com/dotnetcore/DotnetSpider/wiki

SAMPLES

Please see the Project DotnetSpider.Sample in the solution.

BASE USAGE

Base usage Codes
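
The linked sample boils down to a Spider subclass that configures a few options, attaches data flows, and enqueues start requests. The sketch below is assembled from the entity-spider example further down this README; the non-generic DataParser and the start URL are illustrative assumptions:

```csharp
using System.Collections.Generic;

public class MySpider : Spider
{
	public MySpider(SpiderParameters parameters) : base(parameters)
	{
	}

	protected override void Initialize()
	{
		NewGuidId();                      // give this run a fresh identity
		Speed = 1;                        // requests per second
		Depth = 3;                        // maximum crawl depth
		AddDataFlow(new DataParser());    // parse responses (illustrative parser)
		AddDataFlow(GetDefaultStorage()); // persist parsed results
		AddRequests(new Request("https://news.cnblogs.com/"));
	}
}
```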

ADDITIONAL USAGE: Configurable Entity Spider

View complete Codes

public class EntitySpider : Spider
{
	public EntitySpider(SpiderParameters parameters) : base(parameters)
	{
	}
	
	protected override void Initialize()
	{
		NewGuidId();
		Scheduler = new QueueDistinctBfsScheduler();
		Speed = 1;
		Depth = 3;
		AddDataFlow(new DataParser<CnblogsEntry>()).AddDataFlow(GetDefaultStorage());
		AddRequests(
			new Request("https://news.cnblogs.com/n/page/1/", new Dictionary<string, string> {{"网站", "博客园"}}),
			new Request("https://news.cnblogs.com/n/page/2/", new Dictionary<string, string> {{"网站", "博客园"}}));
	}

	[Schema("cnblogs", "news")]
	[EntitySelector(Expression = ".//div[@class='news_block']", Type = SelectorType.XPath)]
	[GlobalValueSelector(Expression = ".//a[@class='current']", Name = "类别", Type = SelectorType.XPath)]
	[FollowSelector(XPaths = new[] {"//div[@class='pager']"})]
	public class CnblogsEntry : EntityBase<CnblogsEntry>
	{
		protected override void Configure()
		{
			HasIndex(x => x.Title);
			HasIndex(x => new {x.WebSite, x.Guid}, true);
		}

		public int Id { get; set; }

		[Required]
		[StringLength(200)]
		[ValueSelector(Expression = "类别", Type = SelectorType.Enviroment)]
		public string Category { get; set; }

		[Required]
		[StringLength(200)]
		[ValueSelector(Expression = "网站", Type = SelectorType.Enviroment)]
		public string WebSite { get; set; }

		[StringLength(200)]
		[ValueSelector(Expression = "//title")]
		[ReplaceFormatter(NewValue = "", OldValue = " - 博客园")]
		public string Title { get; set; }

		[StringLength(40)]
		[ValueSelector(Expression = "GUID", Type = SelectorType.Enviroment)]
		public string Guid { get; set; }

		[ValueSelector(Expression = ".//h2[@class='news_entry']/a")]
		public string News { get; set; }

		[ValueSelector(Expression = ".//h2[@class='news_entry']/a/@href")]
		public string Url { get; set; }

		[ValueSelector(Expression = ".//div[@class='entry_summary']", ValueOption = ValueOption.InnerText)]
		public string PlainText { get; set; }

		[ValueSelector(Expression = "DATETIME", Type = SelectorType.Enviroment)]
		public DateTime CreationTime { get; set; }
	}
}
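
To actually run a spider like the one above, the sample project wires it up through a host builder. The sketch below follows that general pattern, but the builder and registration method names (SpiderHostBuilder, AddLocalEventBus, and so on) are assumptions; see the DotnetSpider.Sample project for the authoritative bootstrap code:

```csharp
class Program
{
	static void Main(string[] args)
	{
		// Hypothetical bootstrap sketch: register the spider with a host,
		// build the provider, then create and start the spider instance.
		var builder = new SpiderHostBuilder()
			.ConfigureServices(services =>
			{
				services.AddLocalEventBus();        // in-process message bus
				services.AddLocalDownloadCenter();  // in-process download scheduling
				services.AddDownloaderAgent();      // component that performs HTTP requests
				services.AddStatisticsCenter(x => x.UseMemory());
			})
			.Register<EntitySpider>();

		var provider = builder.Build();
		var spider = provider.Create<EntitySpider>();
		spider.RunAsync();
	}
}
```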

Distributed spider

Read this document

WebDriver Support

When you want to crawl a page whose content is loaded by JavaScript, there is only one thing to do: set the downloader to WebDriverDownloader.

Downloader = new WebDriverDownloader(Browser.Chrome);

NOTE:

  1. Make sure ChromeDriver.exe is in the bin folder when using Chrome; install it into your project from NuGet: Chromium.ChromeDriver
  2. Make sure you have already added a *.webdriver Firefox profile when using Firefox: https://support.mozilla.org/en-US/kb/profile-manager-create-and-remove-firefox-profiles
  3. Make sure PhantomJS.exe is in the bin folder when using PhantomJS; install it into your project from NuGet: PhantomJS

NOTICE

When you use the Redis scheduler, please update your Redis config:

timeout 0
tcp-keepalive 60

Buy me a coffee

AREAS FOR IMPROVEMENTS

QQ Group: 477731655 Email: zlzforever@163.com
