Here’s a great article on three techniques for pre-processing raw text input for use in text classification/natural language processing applications.

Modern neural networks cannot interpret labeled text directly; the data must be pre-processed before it can be given to a network for training. One straightforward way to do this is with a bag of words, which is created by scanning through every element in a data set and building a dictionary that assigns each unique word an index.
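
To make the idea concrete, here's a minimal bag-of-words sketch in plain Python (the corpus and function names are illustrative, not taken from the article):

```python
# Minimal bag-of-words sketch: build a vocabulary of unique words,
# then represent each document as a vector of word counts.

def build_vocabulary(documents):
    """Map each unique word across the corpus to an integer index."""
    vocab = {}
    for doc in documents:
        for word in doc.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)  # next free index
    return vocab

def vectorize(doc, vocab):
    """Turn one document into a count vector aligned with the vocabulary."""
    vector = [0] * len(vocab)
    for word in doc.lower().split():
        if word in vocab:
            vector[vocab[word]] += 1
    return vector

corpus = ["the cat sat on the mat", "the dog ate my homework"]
vocab = build_vocabulary(corpus)
print(vocab)                        # {'the': 0, 'cat': 1, 'sat': 2, ...}
print(vectorize(corpus[0], vocab))  # [2, 1, 1, 1, 1, 0, 0, 0, 0]
```

The resulting count vectors can then be fed to a classifier in place of the raw text.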

O’Reilly is holding its first-ever TensorFlow World event, presented with TensorFlow at the Santa Clara Convention Center in Santa Clara, California, October 28–31.

They’ve just announced the speakers and it looks awesome.

Are you going? If so, let us know in the comments.

At TensorFlow World, attendees will see TensorFlow 2.0 in action, discover new ways to use it, and learn how to successfully implement it in their businesses. The event offers a great opportunity to explore the entire machine learning stack and learn about available tools, profitable use cases, and successful implementations from companies like Spotify, Twitter, Amazon, and more.

Speaking of neural networks, here's a live recording of my talk from Azure Data Fest Fall 2019 in Reston.

In this session, I explain neural networks from the ground up.

Neural networks are an essential element of many advanced artificial intelligence (AI) solutions. However, few people understand the core mathematical or structural underpinnings of this concept.

In this session, learn the basic structure of neural networks and how to build out a simple neural network from scratch with Python.
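
To give a flavor of the "from scratch" part, here's a minimal sketch (not the code from the talk) of a one-hidden-layer network trained with plain NumPy on the XOR problem:

```python
import numpy as np

# Tiny one-hidden-layer network trained with gradient descent on XOR.
np.random.seed(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: input -> hidden (2x4 weights + biases), hidden -> output (4x1 + bias).
W1, b1 = np.random.randn(2, 4), np.zeros((1, 4))
W2, b2 = np.random.randn(4, 1), np.zeros((1, 1))
lr = 1.0

for step in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)     # hidden activations
    out = sigmoid(h @ W2 + b2)   # predictions

    # Backward pass (mean squared error loss, sigmoid derivatives)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # trained predictions should be close to [[0], [1], [1], [0]]
```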

This episode was recorded live at the Azure Data Fest in Reston, VA on Oct 11, 2019. You can watch the entire live stream here: http://franksworld.com/2019/10/11/azure-data-fest-reston-live-stream/

Press the play button below to listen here or visit the show page at DataDriven.tv.

Here’s a great tutorial on using Keras to create a digit recognizer using the classic MNIST set.

An artificial neural network (ANN) is a mathematical model that converts a set of inputs to a set of outputs through a number of hidden layers, each of which holds an intermediate representation of the data. In a typical neural network, each node of a layer takes the output of every node in the previous layer as input. A model may have one or more hidden layers.
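
As a rough sketch of the kind of model the tutorial builds, a compact Keras digit recognizer for MNIST could look something like this (the layer sizes and epoch count are illustrative, not taken from the tutorial):

```python
import tensorflow as tf

# Load the classic MNIST digits and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A simple fully connected network: one hidden layer, ten-way softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 image -> 784 inputs
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),  # one probability per digit
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)
test_loss, test_acc = model.evaluate(x_test, y_test)
print("Test accuracy:", test_acc)
```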

ExplainingComputers explores edge computing definitions and concepts.

This non-technical video looks at how edge computing relates to cloud computing, as well as the role of edge computing in deploying vision recognition and other AI applications.

Also introduced are mesh networks, SBC (single board computer) edge hardware, and fog computing.