Selenium is a free, open-source automated-testing framework used to validate web applications across different browsers and platforms. It can also be used as a scraping tool, much like Scrapy and Beautiful Soup.
In this article, we will be looking at how we can use Selenium to scrape all the Location Data from Google Maps using the URL of the location.
So without further ado, let’s get started…
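Before Selenium even enters the picture, it helps to know that Google Maps place URLs usually carry the viewport coordinates right in the path, after an `@` sign. Here is a small stdlib-only sketch of pulling those out of the URL; the `@lat,lng,zoom` pattern is an observed convention of these URLs, not a documented API, so treat it as an assumption.

```python
import re

def extract_coords(maps_url):
    """Pull latitude/longitude out of a Google Maps place URL.

    Maps place URLs typically embed the viewport as '@lat,lng,zoom'
    (an assumption about the URL format, not a documented API).
    Returns (lat, lng) as floats, or None if no match is found.
    """
    match = re.search(r"@(-?\d+\.\d+),(-?\d+\.\d+)", maps_url)
    if match:
        return float(match.group(1)), float(match.group(2))
    return None

url = "https://www.google.com/maps/place/Eiffel+Tower/@48.8583701,2.2944813,17z"
print(extract_coords(url))  # (48.8583701, 2.2944813)
```

The rest of the location data (name, rating, address) lives in the rendered page itself, which is where Selenium comes in.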
Quick Note: Any data collected from websites may be subject to copyright, meaning it should not be reused without the owner’s consent and should definitely not be used for commercial purposes. …
Heroku is a platform as a service based on a managed container system, with integrated data services and a powerful ecosystem, for deploying and running modern apps. The Heroku developer experience is an app-centric approach for software delivery, integrated with today’s most popular developer tools and workflows.
In this article, we will be looking at how we can deploy any Flask app having a Database attached to it on Heroku.
The following steps apply regardless of your codebase: to deploy a Flask application with a database on Heroku, all we have to modify is app.config['SQLALCHEMY_DATABASE_URI']…
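On Heroku, the attached database’s connection string arrives through the `DATABASE_URL` environment variable. One wrinkle worth knowing: Heroku Postgres sets a `postgres://` scheme, which SQLAlchemy 1.4+ no longer accepts, so a common fix is to rewrite it before handing it to Flask. A minimal sketch (the helper name and local fallback URI are my own):

```python
import os

def heroku_db_uri(default="sqlite:///local.db"):
    """Build the SQLAlchemy URI from Heroku's DATABASE_URL env var.

    Heroku Postgres exposes DATABASE_URL with a 'postgres://' scheme,
    which SQLAlchemy 1.4+ rejects, so we rewrite it to 'postgresql://'.
    Falls back to a local SQLite file for development.
    """
    uri = os.environ.get("DATABASE_URL", default)
    if uri.startswith("postgres://"):
        uri = uri.replace("postgres://", "postgresql://", 1)
    return uri

# app.config['SQLALCHEMY_DATABASE_URI'] = heroku_db_uri()
```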
Flask-SQLAlchemy is an extension for Flask that adds support for SQLAlchemy to your application. It aims to simplify using SQLAlchemy with Flask by providing useful defaults and extra helpers that make it easier to accomplish common tasks.
In this article, we will be making a very basic Flask app that accepts user details like Name and Email ID, stores that data in an SQLAlchemy database, and then fetches and displays all the database data in an HTML table.
pip install flask
pip install Flask-SQLAlchemy
Does it ever happen to you that you have created an awesome Flask project and want to share it with your friends and colleagues but don’t know how?
Well, not anymore…
In this article, I will show you two methods you can use to convert your Flask project into a shareable, cross-platform app.
This method converts your entire Flask project into a platform-independent single executable, so you can share your project without worrying about whether the receiver has all the dependent libraries installed on their system.
Aspect-Based Sentiment Analysis (ABSA), also known as Feature-Based Sentiment Analysis, is a technique for finding the various features, attributes, or aspects in a given text along with their respective sentiments.
In this article, we will look at how we can implement ABSA using Python and various NLP tools such as StanfordNLP and NLTK.
The code implementation is based on a paper by Nachiappan Chockalingam, in which he explains ABSA in great detail. …
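The full pipeline needs StanfordNLP and NLTK model downloads, but the core idea fits in a few lines: for each aspect term, find the nearest opinion word and take its polarity. Here is a deliberately tiny, pure-Python toy of that idea — the lexicon, aspect list, and proximity heuristic are all invented for illustration and are much cruder than what the paper describes:

```python
# Toy aspect-based sentiment sketch: for each aspect term, find the
# closest opinion word in the sentence and take its lexicon polarity.
# The lexicon and aspect list are made up for illustration; the real
# pipeline uses StanfordNLP dependency parses and NLTK instead.
LEXICON = {"great": 1, "amazing": 1, "tasty": 1, "slow": -1, "terrible": -1}
ASPECTS = {"food", "service", "battery"}

def absa(sentence):
    tokens = sentence.lower().replace(",", " ").replace(".", " ").split()
    results = {}
    for i, tok in enumerate(tokens):
        if tok in ASPECTS:
            # pick the opinion word nearest to this aspect term
            best = None
            for j, other in enumerate(tokens):
                if other in LEXICON and (best is None or abs(j - i) < abs(best - i)):
                    best = j
            if best is not None:
                results[tok] = "positive" if LEXICON[tokens[best]] > 0 else "negative"
    return results

print(absa("The food was tasty but the service was terrible"))
# {'food': 'positive', 'service': 'negative'}
```

Dependency parses do the “nearest opinion word” job far more reliably than raw token distance, which is why the paper leans on StanfordNLP.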
A database is a collection of data or information, specially organized for rapid search and retrieval by a computer. Databases are structured to facilitate the storage, retrieval, modification, and deletion of data in conjunction with various data-processing operations.
In this article, we’ll see how we can store data scraped from Amazon with Scrapy in a MySQL database, so that we don’t have to scrape the same data again and again if we have already scraped it before.
This will be a follow-up article on my previous article where I have explained how to build a…
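The “don’t scrape twice” trick comes down to a unique key on the item’s URL plus an insert that ignores duplicates. The sketch below uses stdlib sqlite3 as a stand-in for MySQL so it runs anywhere — with MySQL you would use a connector library and `INSERT IGNORE` instead, and the table and column names here are illustrative, not the article’s actual schema:

```python
import sqlite3

# sqlite3 stands in for MySQL so the sketch is self-contained; with
# MySQL you'd use a connector and "INSERT IGNORE". Table and column
# names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE reviews (
    url TEXT PRIMARY KEY,   -- unique key stops duplicate scrapes
    title TEXT,
    rating REAL
)""")

def save_item(item):
    """Insert a scraped item, silently skipping URLs already stored."""
    conn.execute(
        "INSERT OR IGNORE INTO reviews (url, title, rating) VALUES (?, ?, ?)",
        (item["url"], item["title"], item["rating"]),
    )
    conn.commit()

save_item({"url": "https://amazon.com/p/1", "title": "Great", "rating": 5.0})
save_item({"url": "https://amazon.com/p/1", "title": "Great", "rating": 5.0})  # duplicate
count = conn.execute("SELECT COUNT(*) FROM reviews").fetchone()[0]
print(count)  # 1
```

In Scrapy, `save_item` would live inside an item pipeline’s `process_item` method.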
Flask is a Python framework that allows us to build web applications. Flask is more explicit than Django and also easier to learn, because it requires less base code to implement a simple web application.
In this article, we’ll see how we can integrate Scrapy with Flask and build a web form such that, at the click of a button, the entire Scrapy code runs and returns the scraped data to us.
This article will be a follow-up to my previous article, where I explained how…
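One common shape for this kind of integration is a Flask route that serves the form and a second route that, on submit, launches the spider (for example via a `scrapy crawl` subprocess) and returns its output. A minimal sketch — the route paths, spider name, and CLI flags are assumptions for illustration, not the article’s actual code:

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["GET"])
def index():
    # Minimal form; the button posts the target URL to /scrape.
    return (
        '<form action="/scrape" method="post">'
        '<input name="url" placeholder="Product URL">'
        '<button type="submit">Scrape</button></form>'
    )

@app.route("/scrape", methods=["POST"])
def scrape():
    url = request.form["url"]
    # In the real app this is where the spider would run, e.g.:
    #   subprocess.run(["scrapy", "crawl", "reviews", "-a", f"url={url}",
    #                   "-O", "out.json"], check=True)
    # after which you'd read and return out.json. Spider name and
    # flags above are assumptions.
    return {"scraping": url}

# Quick check without a browser:
client = app.test_client()
```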
Scrapy is a fast, open-source web-crawling framework written in Python, used to extract data from web pages with the help of selectors based on XPath.
In this article, we will be looking at how we can use Scrapy to scrape all the Amazon product reviews using just the product URL and automatically store all the scraped data in a JSON file within seconds.
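The two moving parts — XPath-style selection and the JSON dump — can be previewed without installing Scrapy. The sketch below uses stdlib ElementTree (which supports a limited XPath subset) on a tiny invented snippet; real Amazon markup differs, and Scrapy’s selectors plus its `-O reviews.json` feed export do this work for you:

```python
import json
import xml.etree.ElementTree as ET

# Stand-in for Scrapy's response.xpath(...): ElementTree supports a
# limited XPath subset. The review markup below is invented for the
# sketch; real Amazon pages differ.
html = """<div>
  <div class="review"><span class="title">Great value</span><span class="stars">5</span></div>
  <div class="review"><span class="title">Broke fast</span><span class="stars">2</span></div>
</div>"""

root = ET.fromstring(html)
items = []
for review in root.findall(".//div[@class='review']"):
    items.append({
        "title": review.find("span[@class='title']").text,
        "stars": int(review.find("span[@class='stars']").text),
    })

with open("reviews.json", "w") as f:
    json.dump(items, f, indent=2)   # roughly what the feed export writes
```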
Quick Note: Items Scraped:
Setting Up Project
In this article, I show some analytics you can perform on your business: for example, how many people complete a series of steps (like adding an item to the cart and then buying it), which product is more popular in a particular state, or whether people are more likely to buy when offers or discounts are given to them.
For this analytics report, I used the popular Brazilian E-Commerce dataset, which is available on Kaggle.
Importing the Essential Libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.dates …
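The first analysis mentioned — how many people add to cart and then buy — is a two-step funnel. Here is a sketch of the idea on a tiny made-up event log; the column and event names are assumptions for illustration, not the actual dataset’s schema:

```python
import pandas as pd

# Tiny made-up event log standing in for the Kaggle data; column and
# event names are assumed for the sketch.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 4],
    "event":   ["add_to_cart", "purchase",
                "add_to_cart", "purchase",
                "add_to_cart", "add_to_cart",
                "add_to_cart"],
})

carted = set(events.loc[events["event"] == "add_to_cart", "user_id"])
bought = set(events.loc[events["event"] == "purchase", "user_id"])
conversion = len(carted & bought) / len(carted)   # share of carters who bought
print(f"{conversion:.0%}")  # 50%
```

The state-level popularity and discount questions follow the same pattern: group the events, count, and compare.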
Tech Blogger | Python/Java Developer | Data Science Enthusiast