In this challenge, we will process electroencephalogram (EEG) data and try to predict whether a fixation is used for control or is a spontaneous one.
In this challenge, we are going to predict the final price of each house. We are given 2 data sets: train and test. Each data set contains many features that we will explore, looking for possible correlations between them and selecting the most useful ones to predict the house price. Finally, we will test and compare a few models trained on this data in order to select the one with the best performance.
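As a rough sketch of the correlation-based feature selection step (the file name train.csv, the SalePrice target column, and the 0.5 threshold are illustrative assumptions, not the final pipeline):

```python
import pandas as pd

# Load the training set (file name and target column are assumptions).
train = pd.read_csv("train.csv")

# Correlation of every numeric feature with the sale price.
corr = train.select_dtypes("number").corr()["SalePrice"].drop("SalePrice")

# Keep the features most strongly correlated with the price (threshold is arbitrary).
selected = corr[corr.abs() > 0.5].sort_values(ascending=False)
print(selected)
```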
In this post, we will build, train and optimize in TensorFlow one of the simplest Convolutional Neural Networks, LeNet-5, proposed by Yann LeCun, Leon Bottou, Yoshua Bengio and Patrick Haffner in 1998.
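A minimal Keras sketch of the LeNet-5 architecture (layer sizes follow the 1998 paper; the tanh/softmax activations and the Adam optimizer here are assumptions, not necessarily the post's final choices):

```python
import tensorflow as tf

# LeNet-5: two convolution/pooling stages followed by three dense layers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 1)),                 # 32x32 grayscale input
    tf.keras.layers.Conv2D(6, 5, activation="tanh"),
    tf.keras.layers.AveragePooling2D(2),
    tf.keras.layers.Conv2D(16, 5, activation="tanh"),
    tf.keras.layers.AveragePooling2D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(120, activation="tanh"),
    tf.keras.layers.Dense(84, activation="tanh"),
    tf.keras.layers.Dense(10, activation="softmax"),   # 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```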
In this notebook, I'll implement the gradient descent algorithm, test it on a few predefined functions and visualize its behaviour in order to draw conclusions about the importance of each parameter of the algorithm. At the end, I will apply gradient descent to minimize the mean squared error function of the least squares method.
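A small sketch of that final step, gradient descent on the least-squares MSE (the learning rate, iteration count and toy data are arbitrary assumptions):

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, n_iters=2000):
    """Minimize the mean squared error of a linear model y ≈ X @ w."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        residual = X @ w - y
        grad = 2 / len(y) * X.T @ residual  # gradient of the MSE
        w -= lr * grad
    return w

# Toy usage: recover y = 1 + 2x (the column of ones gives the intercept).
x = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 2 * x
print(gradient_descent(X, y))  # approximately [1. 2.]
```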
I downloaded my data from Facebook in .json format and used Python with Jupyter Notebook to explore it. The json and pandas libraries are very useful for reading and displaying the data in a structured way. I added some columns, such as a date (since the available time was a millisecond timestamp) and the total number of characters for each row (each row represents a sent message). Finally, I exported the data as an .xls file and opened it with Tableau to make the graphs.
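A hedged sketch of that preprocessing (the file name and the messages/timestamp_ms/content keys follow Facebook's usual messages export but may differ for other exports; the output is written as .xlsx here):

```python
import json
import pandas as pd

# Load one conversation file from the Facebook export (file name is an assumption).
with open("message_1.json", encoding="utf-8") as f:
    data = json.load(f)

df = pd.DataFrame(data["messages"])

# The export stores the send time as a millisecond timestamp.
df["date"] = pd.to_datetime(df["timestamp_ms"], unit="ms")

# Total characters of each sent message.
df["total_characters"] = df["content"].fillna("").str.len()

# Export for Tableau (needs the openpyxl engine for .xlsx files).
df.to_excel("messages.xlsx", index=False)
```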