UCRAWLER BUSINESS HELP CENTER

How uCrawler works

uCrawler automatically scrapes articles from any news source and delivers them as JSON via API, or writes them directly into the database of your website or mobile app.

uCrawler is completely automated. All you need to do is add news sources.
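As a rough illustration of consuming the JSON feed, here is a minimal Python sketch. The endpoint URL, query parameters, and payload field names below are assumptions for illustration only; see the API Docs section of your account for the real ones.

```python
import json

# Hypothetical endpoint: the real URL and parameters are in your API Docs.
API_URL = "https://api.ucrawler.app/news"

def feed_url(api_key, group_id):
    """Build a request URL for a group's JSON feed (illustrative shape)."""
    return f"{API_URL}?api_key={api_key}&group={group_id}"

def parse_feed(payload):
    """Extract (title, url) pairs from a JSON feed payload.

    The "news"/"title"/"url" field names are assumed for this sketch.
    """
    data = json.loads(payload)
    return [(item["title"], item["url"]) for item in data["news"]]

# A payload in the assumed shape:
sample = '{"news": [{"title": "Hello", "url": "https://example.com/a"}]}'
```

You would fetch `feed_url(...)` with any HTTP client and pass the response body to `parse_feed`.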
Account Tutorial

1
Dashboard
The Dashboard shows key information and the statuses of your sources
Spider is running. Please wait...
The uCrawler spider collects news from your sources every 5-20 minutes (depending on configuration); on a Demo account, every 12 hours.

While the spider is running, you cannot use the interface until the process finishes. It may take 1-20 minutes, depending on the number of sources. We block the interface to prevent any interruptions to data collection.

You can start and stop the uCrawler Spider while you work in your account (see the Spider tutorial). Please don't forget to start the Spider again.

2
Sources
You can add groups and news sources on the Dashboard or Groups pages. Sources may contain articles; PDF, Word, Excel, or PowerPoint files; or Telegram channels. Or just send us a list and we will add them at short notice.
How to Create new Group
1
Click the "Add new Group" button on the Groups or Dashboard page
2
Fill in the "Group name" and "Result limiter" (the maximum number of news topics you will get via API) fields
3
Click "Save"
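To make the "Result limiter" setting concrete: it caps how many news topics the API returns for the group, as this hypothetical Python sketch illustrates (the function name is ours, not part of uCrawler).

```python
def apply_result_limit(topics, limit):
    """Return at most `limit` news topics, mirroring the group's
    Result limiter setting (illustrative only)."""
    return topics[:max(0, limit)]
```

So with a limiter of 50, a group that collected 200 topics would still return only the 50 most recent via the API.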
How to Create new Source
1
Click the "Add new Source" button inside your Group or on the Dashboard page

2
Add a web page URL or RSS link to the "Start Page" field, then click the Save button.
Use "Choose news links" for non-RSS URLs only. Click the "Choose news links" button and point at the desired news blocks with the visual selector. Selected links are highlighted in green. Just close the popup after you finish selecting links.
3
Click the "Run spider" button on the left to test news scraping.
It may take up to 3 minutes.
4
Check the collected data by clicking the "HTML samples" button.
Visual and JSON previews of the collected data

3
Filters
Filter the newsfeed, based on any set of sources, using keywords, powered by the integrated Elasticsearch
1
To create a new filter, click "Filters" in the main menu, then click "Create new filter".

2
Add the filter name to the "Group name" field
3
Add comma-separated keywords to the "Keywords" field
4
Configure filter options
1) Minimum number of keyword occurrences (default: 1)
2) Search only in title (On / Off)
5
Click "Save"
You can preview the results immediately by clicking "JSON view" or "Demo page"
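The two filter options above can be understood through a local approximation in Python. The real matching is done server-side by Elasticsearch; this sketch (with field names we chose for illustration) only shows how the "minimum occurrences" and "title only" options affect what passes the filter.

```python
import re

def matches_filter(article, keywords, min_occurrences=1, title_only=False):
    """Rough local approximation of a uCrawler keyword filter.

    `keywords` is a comma-separated string, matching how they are
    entered in the "Keywords" field. The "title"/"body" field names
    are assumptions for this sketch.
    """
    text = article["title"] if title_only else article["title"] + " " + article["body"]
    text = text.lower()
    hits = sum(
        len(re.findall(re.escape(kw.strip().lower()), text))
        for kw in keywords.split(",")
        if kw.strip()
    )
    return hits >= min_occurrences

article = {"title": "Markets rally", "body": "Stocks rally on tech earnings."}
```

For example, with `min_occurrences=3` the article above would no longer match "rally", since the word appears only twice.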

4
Spider
The Spider automatically collects news from your sources every 20 minutes (the interval can be reduced on request). You can start and stop the spider daemon.

Don't forget to start Spider again.

5
Account
You can manage user accounts and API keys here.

6
Export
You can connect your Elasticsearch or SQL database for automatic data export. Please contact us at public@ucrawler.app to set up this feature.
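The actual export is configured by the uCrawler team, but to give a sense of the result, here is a hypothetical sketch of articles landing in a SQL table. The table name, columns, and record fields are our assumptions, not uCrawler's schema; SQLite stands in for your real database.

```python
import sqlite3

def store_articles(conn, articles):
    """Upsert exported article records into a `news` table (illustrative)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS news ("
        "url TEXT PRIMARY KEY, title TEXT, published TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO news (url, title, published) "
        "VALUES (:url, :title, :published)",
        articles,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
store_articles(conn, [
    {"url": "https://example.com/a", "title": "Sample", "published": "2024-01-01"},
])
```

Keying on the article URL (as above) is one simple way to keep repeated exports from duplicating rows; your actual schema is agreed with the uCrawler team when the export is set up.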

7
Logs
Log files with detailed information about the Spider's work

8
API Docs
uCrawler API documentation