How to build a live data science project

Hi All,

I’ve been learning on this platform for the last 7-8 months and have found many interesting things.

But I think there is something that still isn’t visible even after completing 80% of the courses.

I haven’t found a single project that is built to run live and be hosted somewhere.

Every project we have done seems to be static: we have the data, then we apply some tools and skills like pandas, SQL, magic functions, matplotlib, etc.

But all these projects were one-time things. In a real, recurring scenario we would not be using a Jupyter notebook; it feels like one-off research work.

So, coming to the point: I wanted to know how to build a live application with Python skills and host the project online, where data is continuously fetched via some APIs and analytical charts are shown with observations on the figures, something like that.

A request to all readers:
if anyone has a similar thought, an idea, or a project sample,
please share it with me.

I really want to see one and build one, but I’m looking for some reference on this part.

Please let me know if you have any further questions on this; I have asked out of curiosity only.

Thank You,
Vinod :slight_smile:

6 Likes

Hi, @vinodgchandaliya .

In fact, some of the projects you have done may well reflect real tasks. A simple example is monthly reporting in a company.

In this case, your script would likewise extract the prepared data from a database, build tables and charts, show some ratios, and at the end simply generate a PDF file. This is partly similar to the guided projects you have encountered.
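A minimal sketch of that flow, assuming a SQLite database (the `sales` table and its `month`/`revenue` columns are invented here purely for illustration):

```python
import sqlite3

import matplotlib
matplotlib.use("Agg")  # render without a display, e.g. on a server
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages


def build_monthly_report(db_path: str, out_pdf: str) -> None:
    """Query the database, draw a chart, and save it as a PDF report."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT month, revenue FROM sales ORDER BY month"
    ).fetchall()
    con.close()

    months = [r[0] for r in rows]
    revenue = [r[1] for r in rows]

    with PdfPages(out_pdf) as pdf:
        fig, ax = plt.subplots()
        ax.bar(months, revenue)
        ax.set_title("Revenue by month")
        ax.set_ylabel("Revenue")
        pdf.savefig(fig)
        plt.close(fig)
```

Scheduled with cron, a script like this turns a one-off notebook analysis into a recurring report.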

But I understand your desire to try something more serious and active. I don’t have a ready project for you, but I may be able to give you a direction.

1. Source

Let’s use Twitter. This is generally a very popular source for pet projects in Data Science.
Short messages, an API that is easy to access, and many users.

Link to the API documentation: https://developer.twitter.com/en/docs/twitter-api/v1/trends/trends-for-location/api-reference/get-trends-place.
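A hedged sketch of calling that endpoint with the standard library (you need your own bearer token; WOEID 1 means "worldwide" per the docs, and error handling is kept minimal on purpose):

```python
import json
import urllib.parse
import urllib.request

TRENDS_URL = "https://api.twitter.com/1.1/trends/place.json"


def parse_trends(payload: list) -> list:
    """Extract the trend entries from a trends/place response.

    The endpoint returns a one-element list whose "trends" key
    holds the list of trending topics.
    """
    return payload[0]["trends"]


def fetch_trends(bearer_token: str, woeid: int = 1) -> list:
    """Fetch the currently trending topics for a location."""
    url = TRENDS_URL + "?" + urllib.parse.urlencode({"id": woeid})
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {bearer_token}"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return parse_trends(json.load(resp))
```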

2. Research task.

Let’s investigate how long tags stay in the trends and what their dynamics look like.
For this we need the following data:

  1. A tag that appeared in the trends.
  2. The number of messages per unit of time for the tag (dynamically updated data).
  3. Optionally, the number of messages per unit of time by tag for the last 7-30 days.
    Decide for yourself how much longer to track a tag’s data after it disappears from the trends.

At this point you should choose the database in which you will store the data and define the structure of the tables.
Then you write the scripts that collect the data.
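One possible SQLite schema for the data points listed above (a sketch only; the table and column names are my own choices, not a prescribed design):

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS trend_tags (
    tag        TEXT,
    first_seen TEXT,           -- when the tag first appeared in the trends
    PRIMARY KEY (tag, first_seen)
);
CREATE TABLE IF NOT EXISTS tag_counts (
    tag         TEXT,
    measured_at TEXT,          -- timestamp of the measurement
    n_messages  INTEGER        -- messages per unit of time for this tag
);
"""


def init_db(path: str) -> sqlite3.Connection:
    """Create the tables if needed and return an open connection."""
    con = sqlite3.connect(path)
    con.executescript(SCHEMA)
    return con
```

Your collection script would then call the API on a schedule (e.g. via cron) and append one row per tag per measurement into `tag_counts`.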

3. Defining the format for presenting the result to users.

  1. These may be tables that show the current trends and the average number of messages per trend.
  2. Trend life-cycle charts.
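The aggregation behind such a summary table can be a single SQL query; here is a sketch against the hypothetical `tag_counts` table from the schema above:

```python
import sqlite3

QUERY = """
SELECT tag,
       AVG(n_messages) AS avg_messages,
       COUNT(*)        AS n_measurements
FROM tag_counts
GROUP BY tag
ORDER BY avg_messages DESC
"""


def current_trend_summary(con: sqlite3.Connection) -> list:
    """Return (tag, average messages, number of measurements) per tag,
    sorted by average message volume."""
    return con.execute(QUERY).fetchall()
```

The same rows can feed an HTML table or a chart on the page that users see.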

4. How can it be extended?

  1. You can create a model that evaluates the emotional tone of messages and shows the average sentiment with which people are currently writing about the trend.
  2. You can create a model that predicts whether a tag will become a trend.
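To make the first idea concrete, here is a deliberately toy illustration using a tiny hand-made word list; a real project would use a trained model or an established sentiment library instead:

```python
# Toy sentiment scoring: count positive vs. negative words per message,
# then average over all messages about one trend.
POSITIVE = {"love", "great", "awesome", "good"}
NEGATIVE = {"hate", "bad", "awful", "terrible"}


def message_sentiment(text: str) -> int:
    """Positive words minus negative words in a single message."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)


def trend_sentiment(messages: list) -> float:
    """Average emotional tone of the messages about one trend."""
    if not messages:
        return 0.0
    return sum(message_sentiment(m) for m in messages) / len(messages)
```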

5. Server.
I would recommend DigitalOcean. For $5 per month you get a server where you can try out various projects. You will work live with the command line while you deploy.

You can use Flask as a web framework for your application and follow this article series: https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world.

It explains in detail how to write and deploy a web application.
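As a taste of what the app side could look like, here is a minimal Flask sketch with one JSON endpoint that a chart on the front end could poll; the route name and the hard-coded data are made up for illustration (the real version would query your database):

```python
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/api/trends")
def trends():
    """Serve the current trend summary as JSON for the front end."""
    # Illustrative hard-coded data; replace with a database query.
    return jsonify([
        {"tag": "#python", "avg_messages": 1200},
        {"tag": "#dataquest", "avg_messages": 300},
    ])


# Start locally with `flask run` (or app.run()) and the endpoint is
# available at http://localhost:5000/api/trends.
```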

6 Likes

Well written. Thanks for the insight @moriturus7

1 Like

@moriturus7
Thank you for the information.
I was seriously looking for an idea and a direction for a real-time, live data science project.
You have not only shared information, you have also equipped me with an idea + information + tools + direction to have a project in hand.
Thank you so much :slight_smile: :innocent:

1 Like