Free data science course Paid Members Public
Join my free data science and machine learning email course with Python. Each day I will share a new lesson and provide code examples and notebooks to help you in your data science journey. The course covers beginner to advanced concepts in data science and machine learning. I will provide
Object detection with Vision Transformer for Open-World Localization (OWL-ViT) Paid Members Public
Convolutional neural networks have been the primary networks applied in object detection. Recently, Transformers have gained popularity in natural language processing and computer vision. In this article, we explore the use of OWL-ViT in object detection. Let’s get started. What is a Vision Transformer? Transformers have been widely
Train ResNet in Flax from scratch (Distributed ResNet training) Paid Members Public
Apart from designing custom CNN architectures, you can use architectures that have already been built. ResNet is one such popular architecture. In most cases, you'll achieve better performance by using such architectures. In this article, you will learn how to perform distributed training of a ResNet model in
Handling state in JAX & Flax (BatchNorm and DropOut layers) Paid Members Public
Jitting functions in Flax makes them faster but requires that the functions have no side effects. The fact that jitted functions can't have side effects introduces a challenge when dealing with stateful items such as model parameters and stateful layers such as batch normalization layers. In this article,
Transfer learning with JAX & Flax Paid Members Public
Training large neural networks can take days or weeks. Once these networks are trained, you can take advantage of their weights and apply them to new tasks, a technique known as transfer learning. As a result, you can fine-tune a new network and get good results in a short period. Let's look at
Flax vs. TensorFlow Paid Members Public
Flax is the neural network library for JAX. TensorFlow is a deep learning library with a large ecosystem of tools and resources. Flax and TensorFlow are similar in some ways and different in others. For instance, both Flax and TensorFlow can run on XLA. Let's look at the differences between
Activation functions in JAX and Flax Paid Members Public
Activation functions are applied in neural networks to ensure that the network outputs the desired result. Activation functions cap the output within a specific range. For instance, when solving a binary classification problem, the outcome should be a number between 0 and 1. This indicates the probability of an
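The binary-classification case mentioned above is the classic use of the sigmoid activation, which is available in `jax.nn`. A minimal sketch (the input values are made up for illustration):

```python
import jax.numpy as jnp
from jax.nn import sigmoid

# Sigmoid squashes any real-valued logit into the (0, 1) range,
# which can be read as a probability for binary classification.
logits = jnp.array([-5.0, 0.0, 5.0])
probs = sigmoid(logits)
print(probs)  # all values lie strictly between 0 and 1; sigmoid(0) == 0.5
```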
Optimizers in JAX and Flax Paid Members Public
Optimizers are applied when training neural networks to reduce the error between the true and predicted values. This optimization is done via gradient descent, which adjusts the network's weights to minimize a cost function. In JAX, optimizers are applied from the Optax library. Optimizers can be classified into two