This API was created for experimentation and portfolio purposes only; check out the INSTRUCTIONS.md.
There are some features that still need to be implemented: you can check them out here.
The application is an interface to a Twitter Mining Job that returns a dataset limited to 100 tweets filtered by the "bieber" keyword. Future parameterization will be implemented here.
You can manipulate and access the mining job through the various routes. This API is documented with Swagger; with the application running, you can access it (for example at localhost/swagger/).
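The core of the job described above is a keyword filter capped at 100 tweets. A minimal sketch of that step, assuming a list of tweet dicts with a `text` field (the function and field names are illustrative, not the project's actual API):

```python
def filter_tweets(tweets, keyword="bieber", limit=100):
    """Return at most `limit` tweets whose text contains `keyword`.

    `tweets` is assumed to be an iterable of dicts with a "text" key,
    matching the defaults described above (keyword "bieber", 100-tweet cap).
    """
    matched = []
    for tweet in tweets:
        if keyword.lower() in tweet.get("text", "").lower():
            matched.append(tweet)
            if len(matched) >= limit:  # stop once the dataset cap is reached
                break
    return matched

# Usage example with fake data:
sample = [{"text": "I love Bieber"}, {"text": "unrelated"}, {"text": "bieber again"}]
print(len(filter_tweets(sample)))  # 2 of the 3 sample tweets match
```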
Please read the following instructions carefully and make sure that you fulfill all the requirements listed.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
- Create your `.env` file in the project root directory according to the `template.env`
- Run the command below in the project root directory:

```
docker-compose up
```
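The exact variables belong in `template.env`; as a purely hypothetical illustration of what such a `.env` might look like for the MongoDB settings mentioned below (names and values are assumptions, always follow the actual template):

```shell
# Hypothetical .env example -- check template.env for the real variable names
MONGODB_HOST=mongodb
MONGODB_PORT=27017
```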
What things you need to install the software and how to install them
- You'll need Python 3 and the `pip install` command;
- You'll need to be capable of running Bjoern
  - Note: check out its installation method and requirements first!
- You'll also need a MongoDB instance to persist the data
- Run the command below:

```
pip install -r requirements.txt
```

- Start a `mongod` service or change the MONGODB variables:

```
sudo service mongod start
```

- You can check it using the `mongo` command to interact with the database
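Once MongoDB is up, the application can build its connection string from the MONGODB variables. A minimal sketch, assuming typical variable names (`MONGODB_HOST`, `MONGODB_PORT` are illustrative guesses, not confirmed by the project):

```python
import os

def mongodb_uri():
    """Build a MongoDB connection URI from environment variables.

    Variable names are assumptions based on a typical setup; the defaults
    match a local mongod on the standard port.
    """
    host = os.environ.get("MONGODB_HOST", "localhost")
    port = os.environ.get("MONGODB_PORT", "27017")
    return f"mongodb://{host}:{port}"

print(mongodb_uri())
```

A driver such as pymongo would then accept this URI directly via `MongoClient(mongodb_uri())`.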
After fulfilling the Installing steps above:
- Start the API from the project root directory:

```
python api/app_server.py
```

Note: remember to put the host address before the route (e.g. http://localhost:5000/healthcheck)
- `/healthcheck` - Route to check if the API is online.
- `/trigger` - Route to trigger the Twitter Mining Job.
- `/status` - Route to check the status of the Twitter Mining Job.
- `/stop` - Route to stop the Twitter Mining Job.
- `/download` - Route to download the `output.tsv` created by the Job.
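The routes above can be exercised with any HTTP client. A minimal sketch using only the standard library (the base URL and port are assumptions matching the example earlier; adjust to your deployment):

```python
from urllib.request import urlopen

BASE_URL = "http://localhost:5000"  # assumed default host/port

# The five routes documented above
ROUTES = ["/healthcheck", "/trigger", "/status", "/stop", "/download"]

def route_url(route, base=BASE_URL):
    """Build the full URL for one of the API routes."""
    return base.rstrip("/") + route

def healthcheck(base=BASE_URL):
    """Return True if /healthcheck answers with HTTP 200 (API is online)."""
    with urlopen(route_url("/healthcheck", base)) as resp:
        return resp.status == 200

print(route_url("/status"))  # http://localhost:5000/status
```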
There's also a folder called `notebooks` where you can use Jupyter Notebook to see the step-by-step of the Twitter Mining Job.
- Python 3 - Main programming language used for this application
- Bjoern - Web server used to serve the Flask application
- Swagger - Live Documentation framework
- Docker - "Enterprise Container Platform for High-Velocity Innovation"
- Jupyter Notebook - "An open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text."
- MongoDB - "A cross-platform document-oriented database program."
- Lucian Lorens - Lorensov
- Gratitude to my family and friends, who are always there supporting and caring;
- Great thanks to the awesome ".zip Team" working with me;
- And Helio, who was there in the most difficult moments cheering me up.