Artificial intelligence is one of the fastest-growing fields in tech. By some estimates, AI will be a trillion-dollar industry by 2035. This matters because AI will be used by every industry, from consumer technology to healthcare. The tech world is witnessing a boom in the AI market, with a large number of companies investing in it. Among the biggest players are Google's DeepMind and the ventures of Elon Musk.
Elon Musk is a man who has a lot on his plate. He is CEO and product architect of Tesla, a co-founder of PayPal, the founder of SpaceX, and a founder of Neuralink. He also chaired SolarCity before Tesla acquired it, and he was a founding co-chair of OpenAI (he left its board in 2018). All of these companies are among the most innovative in their domains.
OpenAI is the most prominent non-profit in the AI business. The company was founded in 2015 by Elon Musk, Sam Altman, Greg Brockman, and others, with backing from investors including Peter Thiel. It is dedicated to making the world a better place through the development of safe artificial intelligence. Over the years, the company has made significant progress in language models (the GPT family) and reinforcement learning (its OpenAI Five bots beat professional Dota 2 players).
Generative Pre-trained Transformer 3 (GPT-3) is a deep learning model built by the OpenAI research team. Its purpose is to handle text from user input: it processes natural language for a wide range of applications, such as text generation, translation, question answering, and summarization.
OpenAI has released an API for the model. Lots of applications already run on OpenAI API calls: as of 2021, the API was generating an average of 4.5 billion words per day, and production traffic continues to scale.
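To make this concrete, a completion request to the API is just a small JSON payload sent with your API key. The sketch below only builds that payload; the endpoint URL, engine name, and parameter names are my assumptions based on the public beta documentation of the time, so check the official API reference before relying on them.

```python
# Minimal sketch of an OpenAI text-completion request (beta-era API).
# Endpoint, engine name, and parameter names are assumptions, not verified here.

API_URL = "https://api.openai.com/v1/engines/davinci/completions"  # hypothetical

def build_completion_request(prompt, max_tokens=64, temperature=0.7):
    """Build the JSON body for a text-completion call."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,    # upper bound on generated tokens
        "temperature": temperature,  # higher values give more random output
    }

payload = build_completion_request("Write a tagline for an AI blog:")
```

Sending it would then be a single authenticated POST, e.g. `requests.post(API_URL, json=payload, headers={"Authorization": f"Bearer {api_key}"})`.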
If you want more examples you can visit this website: https://beta.openai.com/examples
Or this one: https://gpt3demo.com/
GPT-3's full version has 175 billion machine learning parameters. It was trained on roughly 570 GB of filtered internet text, drawn from about 45 TB of raw data (its predecessor, GPT-2, was trained on 40 GB). The cost of training such models is rising fast: by one published estimate, training GPT-3 would cost over $4.6M using Tesla V100 cloud instances.
For comparison, the Core i7-5930K (Haswell-E) delivers about 289 gigaflops, which works out to roughly 3×10^-4 petaflop/s-days of compute per day. Training GPT-3 required several thousand petaflop/s-days (one petaflop/s-day is about 10^20 operations), so it would take millions of such desktop CPUs running for a day to do the same job.
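This back-of-the-envelope arithmetic can be checked in a few lines. The 3,640 petaflop/s-day figure is OpenAI's published estimate for the 175B model; the CPU throughput is the figure quoted above.

```python
# Back-of-the-envelope: how many consumer CPU-days equal GPT-3's training compute?
SECONDS_PER_DAY = 86400
PFS_DAY_OPS = 1e15 * SECONDS_PER_DAY     # ops in one petaflop/s-day (~8.64e19, i.e. ~10^20)
GPT3_TRAIN_PFS_DAYS = 3640               # OpenAI's estimate for the 175B model
CPU_FLOPS = 289e9                        # Core i7-5930K, ~289 gigaflops

# Compute delivered by one CPU in one day, expressed in petaflop/s-days.
cpu_pfs_days_per_day = CPU_FLOPS * SECONDS_PER_DAY / PFS_DAY_OPS  # ~2.89e-4

cpu_days_needed = GPT3_TRAIN_PFS_DAYS / cpu_pfs_days_per_day
print(f"{cpu_days_needed:.2e} CPU-days")  # on the order of 10^7, i.e. millions of CPUs for a day
```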
Here are some videos with examples and applications generating text with GPT-3.
This algorithm matters for several reasons.
Firstly, it was a milestone. When GPT-3 was released in 2020, the biggest deep learning models in natural language processing had around 340 million parameters. GPT-3's largest model has 175 billion parameters, so the computational ceiling keeps rising, thanks in part to research teams making hardware ever more performant. Since then, Google has released the Switch Transformer, a model with more than 1 trillion parameters, made feasible by a sparse mixture-of-experts architecture that reduces the computational cost per input.
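To get a feel for these scales, consider just the memory needed to store the weights, which grows linearly with parameter count. The sketch below assumes 2 bytes per parameter (half precision); it ignores optimizer state and activations, so it is a lower bound, not an accounting of any real deployment.

```python
# Rough memory footprint of model weights at half precision (2 bytes/parameter).
# A sketch only; real training runs also store optimizer state, activations, etc.
BYTES_PER_PARAM = 2

models = {
    "BERT-large": 340e6,           # among the largest NLP models circa 2018-2019
    "GPT-3": 175e9,                # largest GPT-3 variant
    "Switch Transformer": 1.6e12,  # sparse mixture-of-experts model
}

footprint_gb = {name: n * BYTES_PER_PARAM / 1e9 for name, n in models.items()}
for name, gb in footprint_gb.items():
    print(f"{name:>18}: {gb:,.1f} GB")
```

Even before any computation happens, the largest models no longer fit in a single accelerator's memory, which is one reason training them requires large distributed clusters.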
Secondly, deep learning models have traditionally handled one task each: you needed one model for translation, another for text generation, and so on. With this kind of algorithm, a single model is multitask. Moreover, you can specify which task you want it to perform simply by describing the task, and optionally giving a few examples, in the prompt.
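In practice, "specifying the task" mostly means writing it into the prompt. The sketch below assembles a few-shot prompt for translation; the exact layout (task description, examples, then the query) is an illustrative convention, not a format prescribed by GPT-3 itself.

```python
# One model, many tasks: the task is selected by the prompt, not by the weights.
def few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt: task description, worked examples, then the query."""
    lines = [task]
    for source, target in examples:
        lines.append(f"{source} => {target}")
    lines.append(f"{query} =>")       # the model is expected to complete this line
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Translate English to French:",
    [("cheese", "fromage"), ("hello", "bonjour")],
    "goodbye",
)
```

Swapping the task description and examples (say, for summarization or question answering) re-targets the very same model without any retraining.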
As a disclaimer, I would say technology is not the issue; humankind is. With nuclear physics you can power cities or annihilate them; with gunpowder you can speed up construction or make bullets; and so on. It is exactly the same for artificial intelligence: you can enhance humankind by assisting doctors and workers, or you can build a massive surveillance system. GPT-3, for instance, could become your personal assistant.
Moreover, we face another issue: the internet contains both good and bad content, so by training the algorithm on a large slice of the internet, GPT-3 can reproduce human mistakes and biases even when filters are in place.
We have seen that GPT-3 is very powerful and can process text across many applications. I would add that this technology is young: the first GPT model was released in 2018, so we are only at the beginning, and beautiful things remain to be discovered and improved.
Thank you for taking the time to read this article. If you have questions or remarks, feel free to contact me directly.
Next time, if you are interested, I can introduce you to DeepMind, Google's AI research lab, and its many AI use cases.
For your information, some part of this article has partially been written thanks to the GPT-3 algorithm. ;)
https://techwithtech.com/how-many-flops-computer/
https://medium.com/predict/what-you-need-to-know-about-gpt-3-and-why-it-matters-4878215b78e8