Who has never heard of “citizen development”? In this short post I try to counter this trend by looking beyond its current limits and sketching out the next steps. Just ask yourself: do business users really want to develop their own applications or processes? Or is this way of doing things just a step towards something much more augmented?
In this article we will see how, with the YOLO neural network, we can very simply detect several objects in a photo. The goal is not to go into the details of this network's implementation (which is much more complex than a simple sequential CNN), but rather to show how to use the reference implementation, written in C++ and called Darknet.
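To give a feel for one small piece of the YOLO pipeline before diving in: a detector typically returns several overlapping boxes for the same object, which are then filtered by non-max suppression. Here is a minimal, framework-free sketch of that step in pure Python; the box coordinates and scores are made up for illustration.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring boxes and drop detections that overlap them."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep

# Two overlapping boxes on the same object, plus one separate box (made-up values)
boxes = [(100, 100, 200, 200), (110, 105, 210, 205), (300, 50, 400, 150)]
scores = [0.9, 0.6, 0.8]
print(non_max_suppression(boxes, scores))  # → [0, 2]
```

In practice Darknet (and OpenCV's dnn module) does this for you; the sketch only shows the idea behind the filtering.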
NLPCloud.io is an API for easily using NLP in production. It is based on the pre-trained models of spaCy and Hugging Face (built on transformers). In this article we will see how to use this API in a few lines …
In this tutorial, I invite you to discover a small open-source framework that is very easy to set up and use, and which will let you create an interface for your Machine Learning models. Follow the leader …
In this article we will discuss the concept of Transfer Learning … or how to avoid redoing long and costly training by partially reusing a pre-trained neural network. To do this we will use the reference network in the field: VGG-Net (VGG16).
In this article I propose to create a convolutional neural network to do NLP; for the data I will use a dataset you can find directly among the Kaggle datasets: FrenchFakeNewsDetector. As you will have understood, the objective is twofold: on the one hand, to see how we can use the convolution technique on vectors (1 dimension, instead of images with 2+ dimensions), and on the other hand, to do NLP on French-language data.
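To make the "convolution on vectors" idea concrete before the full article: a 1-D convolution slides a filter over consecutive word vectors instead of over image pixels. Here is a minimal NumPy sketch with made-up embeddings (real models would learn many such filters).

```python
import numpy as np

def conv1d(sequence, kernel):
    """Valid 1-D convolution (cross-correlation, as in Keras Conv1D):
    slide a (width, dim) kernel over a (length, dim) sequence of word
    vectors and return one activation per window position."""
    length, dim = sequence.shape
    width = kernel.shape[0]
    return np.array([
        np.sum(sequence[i:i + width] * kernel)
        for i in range(length - width + 1)
    ])

# A toy "sentence": 5 words embedded in 4 dimensions (random, for illustration)
rng = np.random.default_rng(0)
sentence = rng.normal(size=(5, 4))
kernel = rng.normal(size=(3, 4))   # one filter spanning 3 consecutive words

out = conv1d(sentence, kernel)
print(out.shape)  # → (3,) : one activation per window of 3 words
```

The output length is `length - width + 1`, exactly as in image convolutions, just along a single axis.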
When I finished the article on gradient descent, I realized that two important points were missing. The first concerns the stochastic approach used when the data sets are too large; the second is to see very concretely what happens when we choose the value of the learning rate poorly. I will therefore take advantage of this article to finally continue the previous one 😉
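The learning-rate point can be previewed in a few lines. On the simple function f(x) = x², whose gradient is 2x, a small learning rate converges towards the minimum while a rate above 1 makes each step overshoot further than the last (a toy sketch, not the article's actual experiment):

```python
def gradient_descent(lr, steps=50, x0=10.0):
    """Minimise f(x) = x**2 (gradient: 2*x) starting from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x   # standard update: x <- x - lr * f'(x)
    return x

print(abs(gradient_descent(lr=0.1)))  # tiny: converges towards 0
print(abs(gradient_descent(lr=1.1)))  # huge: |x| grows at every step
```

With lr = 0.1 each step multiplies x by 0.8; with lr = 1.1 it multiplies x by −1.2, so the iterate oscillates and diverges.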
How can we talk about Machine Learning, or even Deep Learning, without addressing the famous gradient descent? There are of course many articles on the subject, but you often have to read several of them to fully understand all the mechanisms. They are often too mathematical, or not enough; here I will try above all to explain how it works smoothly and step by step, in order to demystify the subject.
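As a taste of the mechanism, here is gradient descent in its simplest useful form: fitting y = w·x to noiseless data by repeatedly stepping opposite the gradient of the mean squared error (a toy sketch with made-up data, not the article's example).

```python
# Fit y = w * x to data generated by y = 2x, by gradient descent on the MSE.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w, lr = 0.0, 0.01
for _ in range(1000):
    # gradient of (1/n) * sum((w*x - y)**2) with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad   # step against the gradient

print(round(w, 3))  # → 2.0
```

The loop never "knows" the answer is 2; it simply follows the slope of the error downhill until the gradient vanishes.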
For an analysis I wanted to do, and after several searches, I realized that it was not that easy to get historical weather data. Of course, as I'm French, I went to the Météo France Open Data portal and tried other open data sites, but found nothing really usable, it seems, without a paid subscription. So I decided to retrieve the data with a Python program and the scraping technique.
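The core of such scraping is turning an HTML table into rows of data. Here is a minimal sketch using only the standard library's `html.parser`; the HTML snippet is invented for illustration and real weather pages will of course have a different structure.

```python
from html.parser import HTMLParser

# A made-up snippet standing in for a weather-history page (real sites differ).
PAGE = """
<table id="weather">
  <tr><th>Date</th><th>Temp (°C)</th></tr>
  <tr><td>2021-01-01</td><td>3.5</td></tr>
  <tr><td>2021-01-02</td><td>5.1</td></tr>
</table>
"""

class WeatherTableParser(HTMLParser):
    """Collect the text of every <td> cell, grouped row by row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:   # skip the header row (no <td>)
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

parser = WeatherTableParser()
parser.feed(PAGE)
print(parser.rows)  # → [['2021-01-01', '3.5'], ['2021-01-02', '5.1']]
```

In a real program the page would be fetched with an HTTP client first, and a library like BeautifulSoup would make the parsing less verbose; the logic stays the same.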