TensorFlow gets a new release.

Here’s a great round-up of the new features and improvements.

Google has just released TensorFlow 2.2.0 with many new features and improvements, including the new Profiler for TensorFlow 2 on CPU, GPU, and TPU. TensorFlow 2.2.0 also drops support for Python 2, which reached end of life in January 2020. There are many other features and improvements in this version of […]

TensorFlow.js is a library for developing and training machine learning models in JavaScript and deploying them in a browser or on Node.js.

It is an open source, hardware-accelerated JavaScript library for training and deploying machine learning models.

Amazing to see the innovation of TensorFlow combined with the reach of more developers.

In recent years, a great deal of attention has been paid to artificial intelligence (AI) and machine learning (ML). In this context, the most popular technologies are the Python and R environments, or even C++ libraries. One of the most popular frameworks among developers is TensorFlow, whose development at Google began in 2011. TensorFlow’s core is written in C++, with bindings for Python, Java, and R, but one crucial language was missing: JavaScript.

deeplizard teaches us how to normalize a dataset. We’ll see how dataset normalization is carried out in code, and how normalization affects the neural network training process.

Content index:

  • 0:00 Video Intro
  • 0:52 Feature Scaling
  • 2:19 Normalization Example
  • 5:26 What Is Standardization
  • 8:13 Normalizing Color Channels
  • 9:25 Code: Normalize a Dataset
  • 19:40 Training With Normalized Data
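The standardization step covered in the video can be sketched in plain Python. This is an illustrative, framework-free version of the math (the function names are mine, not from the video’s code): each value is rescaled to z = (x − mean) / std, so the resulting data has mean 0 and standard deviation 1.

```python
def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)


def std(xs):
    """Population standard deviation of a list of numbers."""
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5


def standardize(xs):
    """Rescale values so the result has mean 0 and standard deviation 1."""
    m, s = mean(xs), std(xs)
    return [(x - m) / s for x in xs]


# Example: normalized pixel intensities in [0, 1]
pixels = [0.0, 0.25, 0.5, 0.75, 1.0]
z = standardize(pixels)  # centered at 0, unit spread
```

In practice a framework computes the same statistics per color channel over the whole dataset, then applies this transform to every image.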

From navigating to a new place to picking out new music, AI-backed algorithms have laid the foundation for much of modern life.

In this article, get a higher-level view of Google’s TensorFlow deep learning framework, with the ultimate goal of helping you to understand and build your own deep learning algorithms from scratch.

Over the past couple of decades, deep learning has evolved rapidly, disrupting a wide range of industries and organizations. Its roots go back to 1943, when Warren McCulloch and Walter Pitts created a computational model based on the neural networks of the human brain, laying the groundwork for artificial neural networks (ANNs). Deep learning now denotes the branch of machine learning that trains multi-layered neural networks on large amounts of data.
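The McCulloch–Pitts model mentioned above can be illustrated with a minimal threshold unit. This is a modern paraphrase of the 1943 idea, not their original notation: the unit “fires” only when the weighted sum of its binary inputs reaches a threshold.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Return 1 (fire) iff the weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0


# With weights [1, 1] and threshold 2, the unit behaves like an AND gate:
# it fires only when both inputs are on.
def and_gate(a, b):
    return mcculloch_pitts([a, b], [1, 1], 2)
```

Modern ANNs replace the hard threshold with differentiable activations and learn the weights from data, but the weighted-sum-then-nonlinearity structure is the same.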

Google introduced Tensor Processing Units, or TPUs, four years ago.

TPUs, unlike GPUs, were custom-designed for operations such as the matrix multiplications that dominate neural network training.
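To make concrete what a TPU’s matrix unit accelerates, here is the textbook matrix multiplication written naively in plain Python. A TPU performs this same arithmetic in hardware, streaming large matrices through a systolic array instead of looping element by element:

```python
def matmul(A, B):
    """Multiply an m×k matrix A by a k×n matrix B the textbook way:
    each output cell is the dot product of a row of A and a column of B."""
    k = len(B)      # inner dimension (columns of A == rows of B)
    n = len(B[0])   # columns of the result
    return [
        [sum(A[i][p] * B[p][j] for p in range(k)) for j in range(n)]
        for i in range(len(A))
    ]


# Example: multiply two 2×2 matrices.
C = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])  # [[19, 22], [43, 50]]
```

Every layer of a neural network reduces to batches of exactly this operation, which is why specializing silicon for it pays off.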

Here’s a great beginner’s guide to the technology.

Google TPUs come in two forms: Cloud TPU and Edge TPU. Cloud TPUs can be accessed from a Google Colab notebook, which gives users access to TPU pods that sit in Google’s data centres. Edge TPU, on the other hand, is a custom-built development kit for building specific applications. In the next section, we will look at how TPUs work and their key components.