Can intelligence emerge simply by training a big enough language model on enough data? OpenAI puts this question to the test with GPT-3, a model with 175 billion parameters.


A few days ago, OpenAI announced the latest successor in its line of language models (LMs): GPT-3. This is the largest model trained so far, with 175 billion parameters. While training such a large model has its merits, reading the 72-page paper describing it can be tiresome. In this blog post I’ll highlight the parts I find interesting for people already familiar with LMs, who merely wish to know (most of) the important points of this work.

What’s in a Language Model?

“The diversity of tasks the model is able to perform in a zero-shot setting suggests that high-capacity models trained to maximize the likelihood of…

Text generation using GPT-2 is quite easy, using the right tools. Learn how to do it, as well as how to fine-tune the model on your own dataset.

Natural Language Generation (NLG) is a well-studied subject in the NLP community. With the rise of deep learning methods, NLG has become better and better. Recently, OpenAI has pushed the limits with the release of GPT-2: a Transformer-based model that predicts the next token at each time step.

Nowadays it’s quite easy to use these models: you don’t need to implement the code yourself or train them using expensive resources. HuggingFace, for instance, has released an API that eases access to the pre-trained GPT-2 model OpenAI published. …
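To make this concrete, here is a minimal sketch of generating text with GPT-2 through the HuggingFace `transformers` library (this is an illustrative snippet, not the post's actual code; it assumes the `transformers` package is installed, and the first call downloads the small `gpt2` checkpoint):

```python
# Minimal sketch: text generation with the pre-trained GPT-2 model
# via HuggingFace's pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
outputs = generator(
    "Natural Language Generation is",
    max_new_tokens=20,       # generate up to 20 new tokens after the prompt
    num_return_sequences=1,
)
print(outputs[0]["generated_text"])
```

Fine-tuning on your own dataset follows the same library, with a training loop on top of the same pre-trained weights.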

An unsupervised approach to digit classification and generation

The Variational Autoencoder (VAE) is a paragon of neural networks that try to learn the shape of the input space. Once trained, the model can be used to generate new samples from the input space.

If we have labels for our input data, it’s also possible to condition the generation process on the label. In the MNIST case, it means we can specify which digit we want to generate an image for.

Let’s take it one step further… Could we condition the generation process on the digit without using labels at all? …
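The labeled case mentioned above can be sketched in a few lines: the decoder receives the latent vector concatenated with a one-hot digit label, so at generation time we choose which digit to produce. The decoder weights below are random placeholders standing in for a trained network; this is an illustrative sketch, not the post's implementation.

```python
import numpy as np

# Sketch of label-conditioned generation: decoder input = [z, one-hot label].
rng = np.random.default_rng(0)
latent_dim, n_classes, image_dim = 8, 10, 784
W = rng.normal(size=(latent_dim + n_classes, image_dim))  # placeholder decoder weights

def generate(digit: int) -> np.ndarray:
    z = rng.normal(size=latent_dim)        # sample a latent vector from the prior
    label = np.eye(n_classes)[digit]       # one-hot encode the requested digit
    decoder_input = np.concatenate([z, label])
    return 1 / (1 + np.exp(-decoder_input @ W))  # sigmoid -> pixel intensities

image = generate(7)
print(image.shape)  # (784,) — a flattened 28x28 "image"
```

The unsupervised version teased here removes the labels and lets the model discover the digit structure on its own.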

How to structure your TensorFlow graph like a software engineer

So you’ve finished training your model, and it’s time to get some insights into what it has learned. You decide which tensor should be interesting, and go look for it in your code to find out what its name is. Then it hits you: you forgot to give it a name. You also forgot to wrap the logical code block in a named scope, which means you’ll have a hard time getting a reference to the tensor. This holds for Python scripts as well as for TensorBoard:

Can you see that small red circle lost in the sea…
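The habit being advocated looks roughly like this: give tensors explicit names and wrap logical blocks in name scopes, so they can be fetched later by name. This is an illustrative sketch using the TF1-style graph API (via `tf.compat.v1`), not code from the post itself:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # use the TF1-style graph API

# Name the scope and the tensors inside it.
with tf.compat.v1.name_scope("classifier"):
    x = tf.compat.v1.placeholder(tf.float32, shape=[None, 4], name="inputs")
    w = tf.Variable(tf.zeros([4, 2]), name="weights")
    logits = tf.linalg.matmul(x, w, name="logits")

# Later (even from other code), the tensor can be fetched by its full name:
graph = tf.compat.v1.get_default_graph()
fetched = graph.get_tensor_by_name("classifier/logits:0")
print(fetched.name)  # classifier/logits:0
```

With the scope in place, the tensor also appears under a collapsible "classifier" node in TensorBoard instead of being lost in the sea of ops.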

How to apply your model to input it has never seen before

Some of the problems we tackle using machine learning involve categorical features that represent real-world objects, such as words, items and categories. So what happens when, at inference time, we get values that have never been seen before? How can we prepare in advance so we can still make sense of the input?

Unseen values, also called OOV (Out of Vocabulary) values, must be handled properly. Different algorithms deal with OOV values in different ways, and different assumptions about the categorical features call for different treatments as well.

In this post, I’ll focus on the…
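One of the simplest strategies, shown here as an illustrative sketch (not necessarily the one the post focuses on), is to reserve a dedicated index for OOV values when building the vocabulary, and map any unseen value to it at inference time:

```python
# Reserve index 0 for out-of-vocabulary values; known values start at 1.
OOV_INDEX = 0

def build_vocab(values):
    return {v: i for i, v in enumerate(sorted(set(values)), start=1)}

def encode(value, vocab):
    # Any value not seen at training time falls back to the OOV index.
    return vocab.get(value, OOV_INDEX)

vocab = build_vocab(["cat", "dog", "bird"])
print(encode("dog", vocab))   # 3 — a known value gets its own index
print(encode("fish", vocab))  # 0 — an unseen value maps to OOV
```

Downstream, the OOV index gets its own embedding row, so the model learns a generic representation for "something I've never seen".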

Learn how node2vec works, and what kind of information it captures that word2vec doesn’t — includes case study

In the last couple of years, deep learning (DL) has become a main enabler for applications in many domains, such as vision, NLP, audio and clickstream data. Recently, researchers have started to successfully apply deep learning methods to graph datasets in domains like social networks, recommender systems and biology, where data is inherently structured as graphs.

So how do Graph Neural Networks work? Why do we need them?

The Premise of Deep Learning

In machine learning tasks involving graphical data, we usually want to describe each node in the graph in a way that allows us to feed it into some machine learning…
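node2vec's core idea is to turn the graph into "sentences" by sampling random walks, then feed those walks to a word2vec-style model so that nodes appearing in similar neighborhoods get similar vectors. Here is a sketch of the walk-sampling stage only; the walks below are uniform, whereas node2vec proper biases them with its p and q parameters, and the downstream word2vec training is omitted:

```python
import random

def random_walk(adj, start, length, rng):
    # Walk over the graph by repeatedly stepping to a random neighbor.
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

# A toy undirected graph as an adjacency list.
adj = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
rng = random.Random(42)
walks = [random_walk(adj, node, length=5, rng=rng)
         for node in adj for _ in range(2)]
print(walks[0])
```

Each walk is then treated as a sentence of node "words", which is exactly the vector-per-node description the paragraph above asks for.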

Personal branding is a thing now. It always has been, but I believe it’s been getting more and more attention recently. More people are aware of its importance, including employers. A big paycheck, assuming you’re good, is a given. Providing opportunities to flourish and build your personal brand is something an increasing number of companies are trying to seduce you with.

While working in the algorithms group at Taboola, I was encouraged by the company to share my knowledge with the data science community. …

Splitting your dataset to train-test sets can sometimes be more complicated than one might expect

About a year ago we incorporated a new type of feature into one of our models used for recommending content items to our users. I’m talking about the thumbnail of the content item.
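The teaser cuts off before the complication is revealed, but one classic pitfall with features like this (offered here as an illustrative assumption, not necessarily the post's actual point) is leakage: if the same item appears in both train and test, the split stops measuring generalization. A group-aware split keeps all samples sharing a key on the same side:

```python
import hashlib

def group_split(samples, key, test_fraction=0.2):
    # Hash the group key so the same item always lands on the same side.
    train, test = [], []
    for s in samples:
        h = int(hashlib.md5(str(key(s)).encode()).hexdigest(), 16)
        (test if (h % 100) < test_fraction * 100 else train).append(s)
    return train, test

# Hypothetical samples: 100 rows spread over 10 content items.
samples = [{"item_id": i % 10, "x": i} for i in range(100)]
train, test = group_split(samples, key=lambda s: s["item_id"])
train_ids = {s["item_id"] for s in train}
test_ids = {s["item_id"] for s in test}
print(train_ids & test_ids)  # set() — no item appears on both sides
```

A plain random row-level split, by contrast, would almost certainly place rows of the same item in both sets.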

TensorFlow is great. Really, I mean it. The problem is that it’s great only up to a point. Sometimes you want to do very simple things, but TensorFlow gives you a hard time. The motivation I had for writing TFFS (TensorFlow File System) will be shared by anyone who has used TensorFlow, including you.

All I wanted was to know the name of a specific tensor, or what its input tensors are (ignoring operations).

All of these questions can be easily answered using TensorBoard.
Sure, you just open the graph tab and visually examine the graph. Really convenient, right? Well…
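Both questions can in fact be answered programmatically, without TensorBoard. Here is a sketch (my own illustration, not TFFS itself) using the TF1-style graph API via `tf.compat.v1`:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode

a = tf.constant(1.0, name="a")
b = tf.constant(2.0, name="b")
total = tf.math.add(a, b, name="total")

# "What is this tensor's name?" — enumerate the graph's operations.
graph = tf.compat.v1.get_default_graph()
op_names = [op.name for op in graph.get_operations()]
print(op_names)  # ['a', 'b', 'total']

# "What are its input tensors?" — walk one step back through the op.
print([t.name for t in total.op.inputs])  # ['a:0', 'b:0']
```

TFFS takes this idea further by exposing the graph as a browsable file system instead of one-off scripts like this.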

In the previous post of this series I introduced the Variational Autoencoder (VAE) framework, and explained the theory behind it.

In this post I’ll explain the VAE in more detail, or in other words — I’ll provide some code :)

After reading this post, you’ll understand the technical details needed to implement VAE.

As a bonus, I’ll show you how, by imposing a special role on some of the latent vector’s dimensions, the model can generate images conditioned on the digit type.

The model will be trained on MNIST, the handwritten digits dataset. …
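One technical detail every VAE implementation needs is the reparameterization trick: instead of sampling z ~ N(mu, sigma²) directly (which blocks gradients), sample eps ~ N(0, 1) and compute z deterministically from mu and log-variance. A minimal numpy sketch of that step, as an illustration rather than the post's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # z = mu + sigma * eps, with eps drawn from a standard normal,
    # so gradients can flow through mu and log_var.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

mu = np.zeros((4, 8))       # batch of 4, latent dimension 8
log_var = np.zeros((4, 8))  # log-variance 0 -> sigma = 1
z = reparameterize(mu, log_var)
print(z.shape)  # (4, 8)
```

In the full model, mu and log_var are the encoder's outputs, and z feeds the decoder.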

Yoel Zeldes

Algorithm Engineer @ AI21 Labs
