Data mining is the process of discovering patterns in large data sets involving methods at the intersection of machine learning, statistics, and database systems. That is how Wikipedia describes data mining. To understand what is behind this description, we need to start a project of our own and collect data to discover patterns in it.

We have been inspired by D. Kriesel to crawl a larger website that provides a lot of data. D. Kriesel studied a major online news site and drew fascinating conclusions from the analysis. Please see also http://www.dkriesel.com/spiegelmining (German).

So we need a site with a lot of content and a large database to store all the data in. And how do we collect the data? Do we build a crawler and store all pages so that we have the whole content, or do we try to get the data through an API? A small sketch of the crawler approach follows below.
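To make the crawler idea more concrete, here is a minimal sketch in Python. It only shows the general idea: the start URL, the link rule and the output folder are placeholders for illustration, not the site we are actually working on.

# Minimal crawler sketch. The URL and the link rule are hypothetical
# placeholders; a real crawler would also respect robots.txt.
import os
import time
import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/archive"   # hypothetical entry point
os.makedirs("pages", exist_ok=True)

visited = set()
queue = [START_URL]

while queue:
    url = queue.pop(0)
    if url in visited:
        continue
    visited.add(url)

    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        continue

    # Store the raw page so we keep the whole content for later analysis.
    with open(f"pages/{len(visited)}.html", "w", encoding="utf-8") as f:
        f.write(response.text)

    # Follow internal links only (a simplified, made-up rule).
    soup = BeautifulSoup(response.text, "html.parser")
    for link in soup.find_all("a", href=True):
        href = link["href"]
        if href.startswith("https://example.com/") and href not in visited:
            queue.append(href)

    time.sleep(1)  # be polite to the server

This stores every page as raw HTML, so nothing is lost even if we later change our mind about which fields we want to extract.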

API example:

https://www.youtube.com/watch?v=6jNWl5d_DOk
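The alternative is to pull the data through an API instead of parsing HTML. A sketch of what such a call could look like, assuming a JSON endpoint that returns a list of articles (the endpoint, parameters and field names are made up for illustration):

# Minimal API sketch. The endpoint and the field names are assumptions,
# not the API of the site we actually work on.
import requests

response = requests.get(
    "https://example.com/api/articles",   # hypothetical endpoint
    params={"page": 1, "per_page": 50},   # hypothetical parameters
    timeout=10,
)
response.raise_for_status()

for article in response.json():
    # Each item is assumed to carry a title and a publication date.
    print(article.get("title"), article.get("published_at"))

An API usually gives cleaner, already structured data, while a crawler works even when no API is offered.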

We have found a website and are now working on the task of adding the data to the database. Do you want to work with us?
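How the collected data could end up in the database might look roughly like this; the SQLite file and the table layout are only an assumption for illustration, not our final schema:

# Sketch of storing collected articles in SQLite. The table layout and
# the sample values are assumptions for illustration only.
import sqlite3

conn = sqlite3.connect("articles.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS articles (
           url TEXT PRIMARY KEY,
           title TEXT,
           published_at TEXT,
           body TEXT
       )"""
)

def save_article(url, title, published_at, body):
    # INSERT OR REPLACE so re-crawling the same page does not create duplicates.
    conn.execute(
        "INSERT OR REPLACE INTO articles VALUES (?, ?, ?, ?)",
        (url, title, published_at, body),
    )
    conn.commit()

save_article(
    "https://example.com/article/1",   # hypothetical values
    "Example title",
    "2018-01-01",
    "Full article text ...",
)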

To compare the state of our knowledge with a person's life: we are like crawling children who are eager to pull themselves up at the table to reach the things on top of it.