Tag Archives: generative ai
OpenAI Python API Example for NLP Tasks

Ever wondered how you can leverage the power of OpenAI’s GPT-3 and, from Jan 2024 onwards, GPT-3.5 models directly in your Python application? Are you curious about generating human-like text with just a few lines of code? This blog post will walk you through an example Python code snippet that utilizes OpenAI’s Python API for different NLP tasks such as text generation. Check out my other post on how to use the LangChain framework for text generation using OpenAI GPT models.

OpenAI Python APIs

The OpenAI Python API is an interface that allows you to interact with OpenAI’s language models, including the GPT-3 models. The following are different popular models that you …
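For a quick taste before diving into the full post, here is a minimal sketch, assuming the openai Python package (v1+) is installed and an API key is available in the OPENAI_API_KEY environment variable; the model name and prompt are illustrative choices, not requirements.

```python
# A minimal text-generation sketch with the OpenAI Python API (v1+).
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Ask a GPT model to perform a simple NLP task: text generation
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any available chat model can be substituted
    messages=[
        {"role": "system", "content": "You are a helpful writing assistant."},
        {"role": "user", "content": "Write a two-sentence summary of what NLP is."},
    ],
    max_tokens=100,
    temperature=0.7,
)

print(response.choices[0].message.content)
```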
Architecting a Generative AI Platform for GPT-based LLM Apps

Have you ever wondered how to build a scalable generative AI platform based on OpenAI GPT models that can serve different applications? Are you a data scientist, product manager, or software engineer looking to understand the intricacies of the architecture of such a scalable generative AI platform? This blog aims to demystify the architectural building blocks needed to create a robust GPT-based platform. By the end, you will have a clear roadmap for architecting, designing, and implementing your own platform for GPT-based large language model (LLM) applications.

Generative AI Platform Architecture for GPT-based LLM Apps

The following is the technology architecture of a generative AI platform that can leverage OpenAI GPT-based …
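To make one such building block concrete, here is a hypothetical sketch of a thin API gateway that lets multiple applications share a single, centrally configured GPT client; FastAPI, the route name, and the model are all illustrative assumptions, not prescriptions from the post.

```python
# A hypothetical platform building block: a thin gateway service that
# multiple downstream applications can call, centralizing the LLM client.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # shared, centrally configured OpenAI client

class CompletionRequest(BaseModel):
    app_id: str   # which downstream application is calling
    prompt: str

@app.post("/v1/complete")
def complete(req: CompletionRequest):
    # Central place to add per-app logging, rate limiting, and prompt policies
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": req.prompt}],
    )
    return {"app_id": req.app_id, "output": response.choices[0].message.content}
```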
Encoder Only Transformer Models Quiz / Q&A

Are you intrigued by the revolutionary world of transformer architectures? Have you ever wondered how encoder-only transformer models like BERT, ELECTRA, or DeBERTa have reshaped the landscape of Natural Language Processing (NLP)? The rapid advancement of machine learning has led to the creation of numerous transformer architectures, each with unique features, applications, and underlying mechanics. Whether you’re a data scientist, machine learning engineer, generative AI enthusiast, or a student eager to deepen your understanding, this quiz offers an engaging and informative way to assess your knowledge and sharpen your skills. It would also help you prepare for your interviews on this topic. Encoder-only transformer models have become a cornerstone in …
OpenAI GPT-3 Models List: Explained with Examples

In the ever-evolving landscape of natural language processing (NLP), OpenAI’s GPT-3 models have garnered significant attention for their ability to understand and generate human-like text. The different GPT-3 models discussed in this blog can be accessed using the APIs and the OpenAI Playground. In this blog post, we will delve into the OpenAI GPT-3 models and provide a comprehensive list, along with explanations and examples of their capabilities. Although GPT-3.5 models are more powerful than their GPT-3 counterparts, only the GPT-3 models are currently available for fine-tuning. Whether you are an experienced data scientist or a curious generative AI enthusiast, understanding these models is crucial in making the most …
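One quick way to see which of these models your own account can access is to query the models endpoint; a minimal sketch, assuming the openai Python package (v1+) and a valid API key:

```python
# List the model IDs available to your account, which is one way to see
# the GPT-3 family models discussed in the post.
from openai import OpenAI

client = OpenAI()

for model in client.models.list():
    print(model.id)
```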
LLM Chain OpenAI Python Example

Have you ever wondered how to fully utilize large language models (LLMs) in our natural language processing (NLP) applications, as we do with ChatGPT? Would you not want to create an application such as ChatGPT, where you write a prompt and it gives you back output such as generated or summarized text? While learning to make a direct API call to an OpenAI LLM is a great start, we can build full-fledged applications serving our end users’ needs. And building prompts that adapt to user input dynamically is one of the most important aspects of an LLM app. That’s where LangChain, a powerful framework, comes in. In this blog, …
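As a preview, here is a minimal LLMChain sketch, assuming a classic (pre-0.1) LangChain install alongside the openai package; class locations may differ in newer LangChain releases, and the prompt template is illustrative.

```python
# A minimal LLMChain: a prompt template that adapts dynamically to user
# input, wired to the OpenAI completion API. Classic (pre-0.1) LangChain.
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# The {topic} variable is filled in at run time from user input
prompt = PromptTemplate(
    input_variables=["topic"],
    template="Write a one-paragraph summary about {topic}.",
)

llm = OpenAI(temperature=0.7)          # wraps the OpenAI completion API
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("large language models"))
```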
Langchain ChatGPT Hello World Python Example

Have you ever wondered how to build applications that not only utilize large language models (LLMs) but are also capable of interacting with their environment and connecting to other data sources? If so, then LangChain is the answer! In this blog, we will learn what LangChain is, what its key aspects are, and how it works. We will also quickly review the concepts of prompt, tokens, and temperature when using the OpenAI API. We will then learn about creating a ‘Hello World’ Python program using LangChain and OpenAI’s large language models (LLMs) such as the GPT-3 models.

What is LangChain Framework?

LangChain is a dynamic framework specifically designed for the …
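Here is the flavor of that ‘Hello World’ program, assuming a classic (pre-0.1) LangChain install and an OPENAI_API_KEY in the environment; the temperature and max_tokens values are illustrative knobs of the kind the post reviews.

```python
# 'Hello World' with LangChain and an OpenAI LLM (classic LangChain API).
from langchain.llms import OpenAI

# temperature controls randomness; max_tokens caps the output length
llm = OpenAI(temperature=0.9, max_tokens=50)

print(llm("Say hello, world!"))
```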
Encoder-only Transformer Models: Examples

How can machines accurately classify text into categories? What enables them to recognize specific entities like names, locations, or dates within a sea of words? How is it possible for a computer to comprehend and respond to complex human questions? These remarkable capabilities are now a reality, thanks to encoder-only transformer architectures like BERT. From text classification and Named Entity Recognition (NER) to question answering and more, these models have revolutionized the way we interact with and process language. In the realm of AI and machine learning, encoder-only transformer models like BERT, DistilBERT, RoBERTa, and others have emerged as game-changing innovations. These models not only facilitate a deeper understanding of …
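To see these capabilities in action, here is a minimal sketch using Hugging Face transformers pipelines; the tasks map directly to the applications above, and the default checkpoints (which can be swapped for BERT variants) are assumptions.

```python
# Encoder-only models at work via Hugging Face pipelines:
# text classification, NER, and extractive question answering.
from transformers import pipeline

# Text classification with an encoder-only model
classifier = pipeline("sentiment-analysis")
print(classifier("Encoder-only transformers are remarkably effective."))

# Named Entity Recognition (NER), grouping sub-word tokens into entities
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Ajitesh wrote this post in Bangalore in 2023."))

# Extractive question answering over a given context
qa = pipeline("question-answering")
print(qa(question="What do encoder-only models power?",
         context="Encoder-only models like BERT power text classification, "
                 "NER, and question answering."))
```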
LLMs & Semantic Search Course by Andrew Ng, Cohere & Partners

Andrew Ng, a renowned name in the world of deep learning and AI, has joined forces with Cohere, a pioneer in natural language processing technologies. Alongside him are Jay Alammar, a well-known educator and visualizer of machine learning concepts, and Serrano Academy, an esteemed institution dedicated to AI research and education. Together, they have launched an insightful course titled “Large Language Models with Semantic Search.” This collaboration represents a fusion of expertise aimed at addressing the growing needs of semantic search in various applications. In an era where keyword search has dominated the search landscape, the need for more sophisticated, content-aware search capabilities is becoming increasingly evident. Content-rich platforms like …
Transformer Architecture Types: Explained with Examples

Are you fascinated by the power of deep learning models that can translate languages, generate creative writing, and even answer complex questions? Ever wondered how a machine can understand and process human language with such finesse? At the heart of these remarkable achievements lies a machine learning model architecture that has revolutionized the field of Natural Language Processing (NLP) – the Transformer architecture, a deep learning architecture. But what makes Transformer models so special? How do they manage to encode the subtle nuances of language and context? Can we understand the complex mathematical machinery that operates behind the scenes? Whether you’re a seasoned data scientist, an aspiring machine learning engineer, …
Quiz: BERT & GPT Transformer Models Q&A

Are you fascinated by the world of natural language processing and the cutting-edge generative AI models that have revolutionized the way machines understand human language? Two such large language models (LLMs), BERT and GPT, stand as pillars in the field, each with unique architectures and capabilities. But how well do you know these models? In this quiz blog, we will challenge your knowledge and understanding of these two groundbreaking technologies. Before you dive into the quiz, let’s explore an overview of BERT and GPT.

BERT (Bidirectional Encoder Representations from Transformers)

BERT is known for its bidirectional processing of text, allowing it to capture context from both sides of a word …
Transfer Learning vs Fine Tuning: Differences

Generative AI is revolutionizing various domains, from natural language processing to image recognition. Two concepts that are fundamental to these advancements are Transfer Learning and Fine Tuning. Despite their interconnected nature, they are distinct methodologies that serve unique purposes when training large language models (LLMs) to achieve different objectives. In this blog, we will explore the differences between Transfer Learning and Fine Tuning, learning about their individual characteristics and how they come into play in real-world scenarios with the help of examples.

What is Transfer Learning?

Transfer Learning is an AI/ML concept that refers to the utilization of a pre-trained model on a new but related task. It …
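The distinction is easy to see in code; a short sketch using the Hugging Face transformers library, where the checkpoint and classification head are illustrative assumptions:

```python
# Transfer learning vs fine-tuning, shown as two trainability regimes
# over the same pre-trained model.
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Transfer learning (feature extraction): freeze the pre-trained encoder
# and train only the new classification head on the target task.
for param in model.bert.parameters():
    param.requires_grad = False

# Fine-tuning: leave all parameters trainable so the pre-trained weights
# are also updated (typically with a small learning rate).
for param in model.parameters():
    param.requires_grad = True
```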
Pre-training vs Fine-tuning in LLM: Examples

Are you intrigued by the inner workings of large language models (LLMs) like BERT and GPT series models? Ever wondered how these models manage to understand human language with such precision? What are the critical stages that transform them from simple neural networks into powerful tools capable of text prediction, sentiment analysis, and more? The answer lies in two vital phases: pre-training and fine-tuning. These stages not only make language models adaptable to various tasks but also bring them closer to understanding language the way humans do. In this blog, we’ll dive into the fascinating journey of pre-training and fine-tuning in LLMs, complete with real-world examples. Whether you are a …
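The two phases also map onto different model heads; a brief sketch, assuming the Hugging Face transformers library, with bert-base-uncased as an illustrative backbone:

```python
# Pre-training vs fine-tuning: the same pre-trained backbone is loaded
# behind two different task heads.
from transformers import AutoModelForMaskedLM, AutoModelForSequenceClassification

# Pre-training objective: masked language modeling (predict masked tokens
# from unlabeled text).
mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Fine-tuning: reuse the pre-trained encoder with a task-specific head,
# e.g. sentiment classification, trained on labeled data.
clf_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
```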
Top 5 Books on Generative AI: New Releases on Amazon

Are you fascinated by the potential of generative artificial intelligence (AI)? Are you looking for the latest insights and knowledge in the field of AI and its creative applications? Look no further! In this blog post, we’ll introduce you to the top 5 books on generative AI that have been making waves on Amazon in the last 90 days. These books delve into various aspects of generative AI, offering readers a comprehensive understanding of its implications, applications, and transformative power. 1. The Artificial Intelligence and Generative AI Bible: [5 in 1] The Most Updated and Complete Guide Author: Alger Fraley Rating: 4.4 Step into the world of generative AI with …
BERT vs GPT Models: Differences, Examples

Are you intrigued by the world of natural language processing (NLP) and the cutting-edge machine learning models that power it? Have you ever wondered what sets apart two of the most prominent models in the field, Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-trained Transformer (GPT)? These models have revolutionized the way machines understand and generate human language, but what exactly differentiates them? In this blog, we will delve into the core architecture, training objectives, real-world applications, examples, and more. By exploring these aspects, we’ll learn about the unique strengths and use cases of both models, providing you with insights that can guide your next project or research endeavor. …
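To ground the contrast, here is a small sketch using transformers pipelines; the checkpoints are illustrative, and each pipeline plays to its family’s native training objective.

```python
# BERT vs GPT in two lines of behavior: bidirectional fill-in-the-blank
# versus left-to-right text generation.
from transformers import pipeline

# BERT (encoder-only): uses context on both sides of the [MASK] token
fill = pipeline("fill-mask", model="bert-base-uncased")
print(fill("Paris is the [MASK] of France."))

# GPT-2 (decoder-only): continues text from left to right
generate = pipeline("text-generation", model="gpt2")
print(generate("Paris is the capital of", max_new_tokens=10))
```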
Retrieval Augmented Generation (RAG) & LLM: Examples

Have you ever wondered how to seamlessly integrate the vast knowledge of Large Language Models (LLMs) with the specificity of domain-specific knowledge or external databases? As the world of machine learning continues to evolve, the need for more sophisticated and contextually relevant responses from models becomes paramount. For data scientists and product managers keen on deploying LLMs in production, the Retrieval Augmented Generation (RAG) pattern offers a compelling solution. In this blog, we’ll dive deep into the RAG pattern, illustrating its power and potential with practical examples. Whether you’re aiming to enhance your product’s AI capabilities or simply curious about the next big thing in machine learning, this exploration …
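Here is a minimal end-to-end sketch of the RAG pattern, assuming the openai Python package (v1+) and numpy; the toy documents, embedding model, and chat model are all illustrative assumptions.

```python
# Minimal RAG: embed documents, retrieve the most relevant ones for a
# question, then generate an answer grounded in the retrieved context.
import numpy as np
from openai import OpenAI

client = OpenAI()
EMBED_MODEL = "text-embedding-ada-002"

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Premium support is available 24x7 for enterprise customers.",
    "The API rate limit is 60 requests per minute per key.",
]

def embed(texts):
    resp = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(documents)

def answer(question, k=2):
    # Retrieve: rank documents by cosine similarity to the question
    q = embed([question])[0]
    sims = doc_vectors @ q / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q)
    )
    context = "\n".join(documents[i] for i in np.argsort(sims)[-k:])
    # Augment + generate: constrain the model to the retrieved context
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("How long do customers have to return a product?"))
```

The design point is that retrieval narrows the model’s context to relevant facts before generation, which is what keeps answers grounded in your own data rather than the model’s general training corpus.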
GPT Models In-context Learning: Examples

Have you ever wondered how AI models like OpenAI GPT-3 (Generative Pre-trained Transformer 3) can generate impressively human-like text? Enter the realm of in-context learning, which gives GPT-3 its conversational abilities and makes it extraordinary. In this blog, we’re going to learn the concepts of in-context learning, its different forms, and how GPT-3 uses it to revolutionize the way we interact with AI.

What’s In-context Learning?

In-context learning is at the heart of large language models (LLMs), enabling GPT models to understand and create text that closely resembles human speech, based on the instructions and examples they’re provided. As the model learns about the context based on the examples provided …
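A few-shot prompt is the canonical example; here is a minimal sketch, assuming the openai Python package (v1+), where the task is learned entirely from the examples in the prompt, with no weight updates.

```python
# Few-shot in-context learning: the model infers the labeling task from
# the examples in the prompt itself.
from openai import OpenAI

client = OpenAI()

few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

Review: The battery lasts all day. Sentiment: Positive
Review: It broke after one week. Sentiment: Negative
Review: The screen is gorgeous and bright. Sentiment:"""

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": few_shot_prompt}],
    max_tokens=5,
    temperature=0,
)
print(response.choices[0].message.content)  # expected: Positive
```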