What is Kaspian

Kaspian is a big data delivery platform that draws on a vast data lake and classifies its contents into meaningful streams. It makes it quick and easy for customers to find the information they are looking for without losing track or spending long hours reading and researching. Identrics’ big data lake encompasses over 130,000 websites, and we index over 1.5 million documents per day.

The platform scales to a global online data monitoring operation, aggregating data from news sites, blogs, forums, company news pages, government websites, and regulators. Through Kaspian, Identrics offers a quick and easy setup process, full article access, and global monitoring.

The technology behind Kaspian

The process behind Kaspian consists of three steps:

  1. Data Layer. Data streams are created through search strings, source metadata, and article enrichments, so relevant data can be retrieved quickly.
  2. Enrichment Layer. This layer determines how the data is enriched: with Document Topics, Named Entities, Document Sentiment, Entity Sentiment, Document Summary, and Facebook Engagement.
  3. Provision. The data streams are managed and the content is delivered via API or sFTP (see the sketch below).
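
To make the provision step concrete, here is a minimal Python sketch of pulling enriched documents from a data stream over an API. The endpoint URL, authentication scheme, and payload field names are illustrative assumptions, not Kaspian’s documented interface.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoint and credential, for illustration only.
API_URL = "https://api.example.com/streams/{stream_id}/documents"
API_TOKEN = "YOUR_API_TOKEN"

def fetch_documents(stream_id: str) -> list:
    """Pull the latest enriched documents for one data stream."""
    response = requests.get(
        API_URL.format(stream_id=stream_id),
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["documents"]  # assumed payload shape

if __name__ == "__main__":
    for doc in fetch_documents("regulatory-news"):
        # The keys below mirror the enrichment layer described above;
        # the exact field names are assumptions.
        print(doc.get("title"))
        print("  topics:", doc.get("topics"))
        print("  entities:", doc.get("named_entities"))
        print("  sentiment:", doc.get("document_sentiment"))
        print("  summary:", doc.get("summary"))
```

With sFTP delivery, the same enriched documents would instead arrive as files on a server you poll, so the choice mainly depends on how your pipeline prefers to ingest content.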

Why is it important to use data platforms such as Kaspian

“Everyone has access to a vast amount of information. But this information has to be turned into knowledge, and it has to reach the right people at the right time.

The essence of what we, at Identrics, do is to make sense of the information we process.”

Vasil Shivachev, COO @Identrics

Shivachev recently gave a presentation on the importance of advanced technology and its benefits at “Technologies Against Disinformation”, a conference organised by Identrics. He talked about different solutions for handling big data in the digital world, such as Kaspian and Topify.

Identrics uses big data to train models. We then enrich the data to extract more value and connect knowledge.

While the three V’s (Volume, Velocity, Variety) tell you whether you are dealing with “big data”, whether you actually need a big data solution depends on your goals. Big data solutions are inherently more suitable when:

  • you are dealing with huge volumes of semi-structured or unstructured data from multiple sources.
  • you need to analyse all of your data and cannot work with a sample of it.
  • the process is iterative in nature.

Global data analytics trends

Did you know that over 3.7 billion people around the globe use the internet actively? That’s more than 49% of the world’s population!

The data in the global datasphere is expected to reach 175 zettabytes by 2025. Since a zettabyte is a trillion gigabytes, that is 175 trillion gigabytes. That’s definitely a lot of zeros. To put it in perspective, the number of bytes in the digital universe is about 40 times the number of stars in the observable universe.

These numbers might sound striking, but once you consider how much data you produce yourself, it becomes easier to imagine how much is generated collectively every day.

The importance of data streams

With these numbers in mind, it is worth looking at why data streams matter and why it is crucial for every company to receive good-quality datasets.

The highest priority for any company is a data delivery provider that can supply datasets with good quality and coverage; it is essential not to miss any valuable material. Furthermore, every company needs to receive the content in the specific format it works with, which can be a challenge for some data delivery providers.
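
To illustrate what that format flexibility involves, here is a minimal Python sketch that serialises one enriched document into two common delivery formats, JSON and CSV. The document fields are simplified assumptions that mirror the enrichments described earlier, not Kaspian’s actual schema.

```python
import csv
import io
import json

# A simplified, assumed document structure for illustration.
document = {
    "title": "Example article",
    "topics": ["regulation", "energy"],
    "document_sentiment": "neutral",
    "summary": "A one-sentence summary of the article.",
}

def to_json(doc: dict) -> str:
    """Serialise the document as pretty-printed JSON."""
    return json.dumps(doc, ensure_ascii=False, indent=2)

def to_csv(doc: dict) -> str:
    """Flatten the document into a single CSV row with a header."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=doc.keys())
    writer.writeheader()
    # Join list-valued fields so every cell is a plain string.
    writer.writerow({key: "; ".join(value) if isinstance(value, list) else value
                     for key, value in doc.items()})
    return buffer.getvalue()

print(to_json(document))
print(to_csv(document))
```

The same idea extends to XML or whatever in-house format the receiving system expects.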

Our data delivery tool integrates high-quality data streams directly into your operations and classifies material into purposeful knowledge. Contact us to learn more about our data delivery platform.