Tech Resources I love!

Preparing for technical interviews

Ace Every Stage of Your Next Technical Interview with these curated resources

Courses on Cloud, Data and AI

Step by step courses with hands-on experience and projects

What is GitHub Spark: The Full Demo Inside
Priyanka Vergadia

Vibe coding on steroids with GitHub Spark. 🚀 GitHub Spark just changed everything about app development: GitHub has launched Spark, an AI-powered coding platform that turns natural language descriptions into fully functional web applications. No coding required, no setup headaches, and one-click deployment to production. This isn't just another AI coding assistant – it's a complete paradigm shift in how we build software.

⚡ What You'll Learn:
  • What GitHub Spark is and how it works
  • Live demonstration of building multiple apps with just natural language
  • Why this matters for developers, designers, and entrepreneurs
  • Honest breakdown of pricing and limitations
  • The future of AI-powered development

🔥 Key Highlights:
  • Vibe Coding: full-stack applications generated from plain English
  • Integrated with Claude 3.5 Sonnet, GPT-4o, and other leading AI models
  • One-click deployment with enterprise-grade hosting
  • Complete GitHub ecosystem integration
  • Real-time live previews and instant iteration

💰 Pricing & Access:
  • Currently available in public preview for GitHub Copilot Pro+ subscribers ($39/month)
  • Includes 375 Spark messages, unlimited manual editing, hosting, and AI inference

🛠️ Perfect For:
  ✅ Rapid prototyping and MVP development
  ✅ Internal tools and personal projects
  ✅ Learning full-stack development concepts
  ✅ Non-technical founders validating ideas
  ✅ Experienced developers eliminating boilerplate work

Read More
Complete Beginners Guide to Hugging Face
Priyanka Vergadia

Hey everyone, and welcome back to my channel, where we talk about cloud tech and AI. Today we're diving into a platform you must know about if you're doing anything with AI: Hugging Face. Hugging Face has been called the GitHub of machine learning, and for good reason. It has become the community where AI models and creations are shared with everybody. By the end of this video you'll understand exactly what it is and why it matters to you, even if you're not somebody who codes every day. So stick around.

This is the Hugging Face homepage. The first thing you'll notice is the tagline, "The AI community building the future," and that really sums up what they're about: a collaborative platform where people share AI tools, models, datasets, and even AI apps. If you scroll down, you can see featured models, trending models, and recently uploaded content, which gives you a taste of what's popular in the AI community. Before we dive deeper, I'd recommend signing up and creating an account, because you'll need one. You can view most of the content without an account, but if you want to use the models, save your favorites, and things like that, you will need an account.

Now, going into the Models tab: this is where all the models are found. There are millions of models here, and you can filter them by task, in categories like natural language processing, classification, audio, and tabular data. You can also filter them by library, dataset, language, and license.
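For reference, pulling one of these models into your own code usually takes just a few lines with the transformers library. Here is a minimal sketch; the model ID is only an example, and you'd swap in whichever model you find on the Models tab:

```python
# Minimal sketch: run a model from the Hugging Face Hub locally.
# Requires: pip install transformers torch
from transformers import pipeline

# Any text-classification model ID from the Models tab works here;
# this sentiment model is just a commonly used example.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Hugging Face makes sharing AI models easy."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```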

Now let's check out Microsoft's popular Phi-4 reasoning model; I wanted to see how far I could go with it. Each model has its own page with documentation. I clicked Deploy, and it took me straight into Azure Machine Learning studio, where I was asked to create a workspace. As soon as I gave it the details, the name and so on, it created that workspace for me and deployed the Phi-4 model from Hugging Face into Azure Machine Learning studio. That was absolutely amazing; it took just a few clicks. You can see it being created, and once it's ready I can go to the workspace. In the workspace, if I click on Endpoints, I'm taken into the Azure OpenAI service, and that's where my Phi-4 endpoint is. If I want to use the endpoint, I click Continue, which takes me into Azure AI Foundry, where I can experiment with the deployed model. It shows me the target URL, the key it created for me, how to use it with my API key, and some samples of how to call the model.
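The samples on that page vary by deployment type, but calling a deployed endpoint generally comes down to an HTTP request with the target URL and key. Here is a rough sketch with placeholder values; copy the real URL, header name, and payload format from the endpoint's own samples page, since they differ between Azure Machine Learning and Azure OpenAI deployments:

```python
# Rough sketch of calling a deployed Azure endpoint with its target URL and key.
# The URL, key, and payload shape below are placeholders, not real values.
import requests

ENDPOINT_URL = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"  # placeholder
API_KEY = "<your-endpoint-key>"  # placeholder; never hard-code real keys in source

headers = {
    "Content-Type": "application/json",
    # Some Azure deployments expect an "api-key" header instead of Bearer auth;
    # use whichever the endpoint's samples page shows.
    "Authorization": f"Bearer {API_KEY}",
}
payload = {"input": "Write a short story about a dog who befriends a robot."}

response = requests.post(ENDPOINT_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```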

I'm also able to play with it in the playground and test it out, so I gave it a prompt. I went with a story about a dog traveling to the mountains, where he meets a robot that is helping a bird survive in the cold, and they all become friends for life. I was just playing around with a prompt, but the idea is that you can go from looking at a model on Hugging Face to actually deploying it in Azure AI Foundry and Azure Machine Learning studio. I then clicked to deploy that endpoint as a web application, and that's what you're seeing now: the Azure web application being created as part of this deployment, right from the model and endpoint we just created. This takes seconds, maybe a minute or so, while the deployment assets are created. Once the app is deployed, I can see it in Azure AI Foundry in my web apps section. There it is, the Phi-4 experiment. I click on the app and there we have it: an entire chat application, built by choosing the Phi-4 reasoning model on Hugging Face, which then goes into Azure Machine Learning studio and Azure AI Foundry and is built out for me as a web application.

Going back to the Hugging Face interface, let's look at the Datasets tab. This is where all the datasets are; there are thousands of them, and you can preview samples right in the browser. You can also filter the datasets by language, task, library, and more, and if you click on one you can see samples from that dataset and start using it.
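For reference, loading one of these datasets into your own code takes a couple of lines with the datasets library. A minimal sketch, where the dataset ID is just an example:

```python
# Minimal sketch: load a dataset from the Hugging Face Hub and peek at a sample.
# Requires: pip install datasets
from datasets import load_dataset

# "imdb" is only an example; any dataset ID from the Datasets tab works.
dataset = load_dataset("imdb", split="train")

print(dataset)                    # features and number of rows
print(dataset[0]["text"][:200])   # preview the first sample
```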

The next thing is one of my favorites: the Spaces section of Hugging Face. This is where things get really exciting, especially for non-coders. Spaces are interactive AI applications that anyone can use right in the Hugging Face browser experience; think of them as ready-to-use AI tools. I clicked on one called Describe Anything, created by NVIDIA. Right in the browser, without doing anything else, I can upload an image, type a description, and get descriptions for regions of my image. This is my dog sitting on a chair in a park. I selected different parts of the image, first the tree and then my dog himself, and the demo did a really good job of telling me what is in each part. And if I wanted to, I could take this Space and deploy it for myself, whether locally or in the cloud.

But before we do that, let's look at another example. Back in Spaces, I can browse by category using the options up top: image generation, 3D modeling, and so on. I went into Stable Diffusion, one of the most common and popular image generation models, and tested it right there in Spaces with the prompt "serene lake at sunset with mountains in the background and a golden retriever watching the sunset." I let it generate the images, and there we have it. I'm not sure about the first one and the second is okay, but it did what I wanted, and I love the third and fourth images; they really do what I asked. The best part, the part I want to show you, is that if I like this Space I can run it locally: I can clone the repo and start working with it right from here, just like how we deployed the Phi-4 model in Azure AI.

And with that, let's look at the Docs section, which is your knowledge center. This is where you'll find deeper technical information. The docs are organized into categories such as client libraries, deployment, interface, and core ML libraries like Transformers (one of the most famous libraries), Diffusers, Tokenizers, and a lot more, like Gradio. The last thing I want to talk to you about is the Community section. This is where people ask questions, share ideas, and learn. The blog part of the community is amazing; you'll see a lot of people contributing posts, and you'll see what's happening and what's hot right now. Then the Learn section is one of my favorites: the LLM course and the Agents course are some of the best courses out there on AI and machine learning right now. The LLM course goes from transformers all the way up to fine-tuning, and the Agents course covers everything from an intro to agents onward.

So that, my friends, was Hugging Face, and we've toured every major section of the platform. Whether you're just curious about AI, want to use existing models, are developing something with AI, or want to contribute, Hugging Face is definitely a platform to check out. Now go explore! And if you liked this video and found it helpful, please hit the like and subscribe buttons for more tech and AI content, and drop a comment if you have questions or want to suggest which AI platform I should cover next. Thank you for watching. See you next time.

Read More
What is Synthetic Data and how to use it effectively in your AI Projects
Priyanka Vergadia

Researchers predict we'll exhaust all fresh text data on the internet in less than 30 years. This looming "data cliff" is why synthetic data is becoming the secret sauce of AI development—our escape hatch from running out of training material.

If you're working with AI systems or curious about how modern language models are trained, understanding synthetic data isn't just helpful—it's becoming essential. Let's dive into what it is, how it works, and why it might be the key to AI's future.

Read More
C-Suite’s Guide to Building Lasting ROI with AI Investments
Priyanka Vergadia

How to build lasting ROI on AI investments: in every meeting I'm in, customer executives are asking, "What's the ROI on my AI project?" The honest answer I have to share is: you won't see it in a few days—or even a few months. That's because AI, unlike a traditional technology rollout, is not a one-off project. It's a habit. And like any habit, it takes time, commitment, and cultural change to form—before the real value emerges.

Traditional project approaches frame AI as a one-time initiative with defined start and end points, typically measured in weeks or months. In contrast, the habit approach recognizes AI as an ongoing process of integration into daily workflows that spans months to years. Research on habit formation indicates that individuals require an average of 66 days to form basic habits, with a range of 18-254 days depending on complexity. For organizations, this timeline extends considerably longer—typically 120 days for organizational changes and up to 365 days for full AI integration.

Read More
AI Created It, But Who Owns It?
Abhishek Sharma

Navigating the complexities of AI copyright can feel like stepping into a legal minefield. As generative AI tools become essential for creators, artists, and businesses, the question of ownership is more critical than ever. Who holds the intellectual property rights to AI-generated content? Is it you, the AI developer, or does it belong to the public domain? Our latest post demystifies the current state of AI copyright, exploring key court cases, the debate around "authorship," and the crucial concept of "fair use." Get the clarity you need to create and innovate with confidence.

Read More
OWASP Top 10 for LLMs and GenAI Cheatsheet
Priyanka Vergadia

The OWASP Top 10 for Large Language Models represents the most critical security risks facing AI applications in 2025. As LLMs become increasingly embedded in applications across industries, understanding and mitigating these risks is crucial for developers and security professionals. In this article let’s go over an AI application architecture covering each of the OWASP Top 10 for LLMs and understand the prevention methods for each.

Read More
What is Model Context Protocol (MCP)?
Priyanka Vergadia

To understand Model Context Protocol (MCP), let's start with a familiar concept: APIs in web applications.

Before APIs became standardized, web developers faced a significant challenge. Each time they needed to connect their application to an external service—whether a payment processor, social media platform, or weather service—they had to write custom code for that specific integration. This created a fragmented ecosystem where:

  • Developers spent excessive time building and maintaining custom connectors

  • Each connection had its own implementation details and quirks

  • Adding new services required significant development effort

  • Maintaining compatibility as services evolved was labor-intensive

APIs (Application Programming Interfaces) solved this problem by establishing standardized ways for web applications to communicate with external services. With standardized APIs:

  • Developers could follow consistent patterns to integrate services

  • Documentation became more standardized and accessible

  • Updates to services were easier to accommodate

  • New integrations became significantly faster to implement

MCP addresses the exact same problem, but for AI applications.

Just as APIs standardized how web applications connect to backend services, MCP standardizes how AI applications connect to external tools and data sources. Without MCP, AI developers face the same fragmentation problem that web developers faced before standardized APIs—they must create custom connections for each external system their AI needs to access.

What is MCP?

Model Context Protocol (MCP) is an open protocol developed by Anthropic that enables seamless integration between AI applications/agents and various tools and data sources. Think of it as a universal translator that allows AI systems to communicate with different external tools without needing custom code for each connection.
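To make that concrete, here is a minimal sketch of what an MCP tool server can look like, based on the FastMCP helper in the official MCP Python SDK. The server name, tool, and canned response are illustrative placeholders, so treat this as the shape of the pattern rather than a drop-in implementation, and check the current SDK docs for exact details:

```python
# Minimal MCP server sketch (assumes the official Python SDK: pip install mcp).
# The server name, tool, and stubbed response are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")  # name shown to connecting AI clients

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a (stubbed) weather summary for a city."""
    # A real server would call an actual weather API or database here.
    return f"The weather in {city} is sunny and 72°F."

if __name__ == "__main__":
    # Exposes the tool over MCP so any MCP-aware client (an AI app or agent)
    # can discover and call it without custom integration code.
    mcp.run()
```

The point is the same as with standardized APIs: the client doesn't need to know anything about this server's internals, only that it speaks MCP.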

Read More
Latest Large Context Model (LCM) Benchmark Explained: L-CiteEval
Priyanka Vergadia

As language models continue to evolve, one of the most significant challenges has been handling long-form content effectively. In this article, we'll explore how modern Large Context Models (LCMs) are pushing the boundaries of context windows and what this means for developers working with AI applications.

The Evolution of Context Windows

The landscape of context windows in language models has evolved dramatically:

  • GPT-3.5 (2022): 4K tokens

  • Claude 2 (2023): 100K tokens

  • GPT-4 (2024): 128K tokens

  • Claude 3 (2024): 200K tokens

  • Gemini Ultra (2024): 1M tokens

  • Anthropic Claude (experimental): 1M tokens

This exponential growth in context window sizes represents a fundamental shift in how we can interact with AI systems. For perspective, 1M tokens is roughly equivalent to 750,000 words or about 3,000 pages of text.
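That estimate follows from the common rough ratios of about 0.75 words per token and roughly 250 words per page; a quick back-of-envelope check:

```python
# Back-of-envelope check of the "1M tokens ≈ 750,000 words ≈ 3,000 pages" figure,
# using rough ratios (~0.75 words per token, ~250 words per page).
tokens = 1_000_000
words = tokens * 0.75     # ≈ 750,000 words
pages = words / 250       # ≈ 3,000 pages

print(f"{tokens:,} tokens ≈ {words:,.0f} words ≈ {pages:,.0f} pages")
```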

We’ll explore and understand LCMs with the help of a recent research paper (L-CiteEval: Do Long-Context Models Truly Leverage Context for Responding?), which introduces a benchmark for measuring how well long-context models actually use the context they are given.

Read More
ONE life change I wish I'd made sooner in my tech career
Priyanka Vergadia

There's a lot I accomplished professionally this year – from an amazing job change to completing 4 more terms of my MBA, learning countless new things, and exploring more of the world. While I share my professional journey on LinkedIn, this blog post is different. It's raw, even a bit vulnerable. I'm sharing this story hoping it might inspire even one person. This year taught me a profound truth: health truly is wealth. Here's my journey. Read on.

Read More
RAG Cheatsheet
Priyanka Vergadia

Ever wondered why you sometimes get misleading answers from generic LLMs? It's like trying to get directions from a confused stranger, right? This can happen for many reasons: the LLM was trained on data that is out of date, it can't do the math or the calculations, or it simply hallucinates. That is where RAG (Retrieval-Augmented Generation) comes in.
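As a toy illustration of the pattern (not code from the post): the core of RAG is to retrieve relevant context first and then ask the model to answer only from that context. The tiny keyword "retriever" and documents below are stand-ins for a real corpus and vector store:

```python
# Toy sketch of the core RAG loop: retrieve relevant text, then ground the
# model's answer in it. A real system would use embeddings + a vector database.
documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday through Friday, 9am to 5pm.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Crude keyword-overlap scoring, just to show the shape of the pipeline.
    def overlap(doc: str) -> int:
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return sorted(docs, key=overlap, reverse=True)[:k]

question = "How many days do I have to return a product?"
context = "\n".join(retrieve(question, documents))

prompt = (
    "Answer using only the context below. If the answer is not there, say so.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)
print(prompt)  # this grounded prompt is what you would send to the LLM
```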

Read More
What is Agentic RAG? Simplest explanation
Priyanka Vergadia

Traditional RAG systems, while foundational, often operate like a basic librarian - they fetch relevant documents and generate responses based on them. Agentic RAG, on the other hand, operates more like a research team with specialized experts. Let's dive deep into when and why you'd choose one over the other.

Read More
Top 11 AI Coding Assistants in 2024
Priyanka Vergadia

As a software developer in 2024, you've probably noticed that AI has fundamentally transformed the way we write code. Gone are the days of endlessly googling syntax or scrolling through Stack Overflow for basic implementations. AI coding assistants have emerged as indispensable tools in a developer's arsenal, promising to boost productivity and streamline the coding process.

But with so many options flooding the market, choosing the right AI coding assistant can feel overwhelming. Should you go with the popular GitHub Copilot, or explore newer alternatives? Is the free tier sufficient for your needs, or should you invest in a premium solution?

This blog is my attempt to explore the current landscape of AI coding assistants and help you make an informed decision based on your specific needs and circumstances. There are many more AI coding assistants out there; I am only covering a few of the better-known ones here.

Read More
How Cloudflare Stopped the Largest DDoS Attack in History in 2024
Priyanka Vergadia

Two weeks ago something huge happened in tech! Cloudflare, a cloud platform that offers DNS and DDoS protection services, auto-mitigated a 3.8 Tbps DDoS attack. To put that in perspective, imagine downloading 950 HD movies... every single second. That's the kind of digital tsunami Cloudflare was up against. Let's demystify what goes into mitigating an attack of this magnitude. Before we get there, let me start by sharing how DDoS attacks work.

Read More
How to effectively use NotebookLM as a Student
Priyanka Vergadia

As an MBA student at the Wharton Business School, I've been using NotebookLM, a game-changing AI tool that has transformed my approach to learning. This AI tool has become an indispensable part of my study routine, particularly when tackling complex case studies and course materials. In this blog post, I'll share my experience and offer insights on how students can leverage NotebookLM to enhance their academic journey.

Read More
3 Hidden Skills Big Companies Teach You!
Priyanka Vergadia

While the decision to work at a large corporation or join a startup is a personal one, the experience of navigating a big company ecosystem offers unique opportunities for professional growth. The skills developed in this environment – from mastering the corporate zoo and managing up and sideways, to building diverse networks and finding common ground amidst competing priorities – are invaluable skills that transcend any single job or company. These competencies not only contribute to our success within the organization but also enhance our overall professional toolkit.

The challenges of bureaucracy and complexity in large companies, often seen as drawbacks, can actually be catalysts for developing patience, persistence, and creative problem-solving skills. As you progress in your career, you'll find that the ability to navigate complex organizational structures, influence without direct authority, and align diverse interests are highly transferable skills, serving you well whether you stay in big business, venture into smaller companies, or even start your own enterprise. So, while it may sometimes feel like you're a small cog in a giant machine, remember that you're simultaneously honing a set of powerful, often overlooked skills that will propel your career forward, regardless of where your professional journey takes you next.

Read More
What is Platform Engineering? How is it different from DevOps?
Priyanka Vergadia

Have you ever wondered how Spotify manages to recommend the perfect song for your mood, or how Uber can connect you with a driver in minutes, anywhere in the world? These seamless experiences aren't just magic, and they're not solely the result of AI. They're powered by a powerful approach to infrastructure and applications called Platform Engineering. Let's dive into what Platform Engineering is, how it differs from DevOps, and why it's becoming the secret weapon for boosting productivity in tech companies.

Read More
3 Reasons Why I quit my job as Google’s Chief Developer Advocate
Priyanka Vergadia

After seven incredible years at Google, I've decided to embark on a new adventure. In this post, I'll share the three main reasons behind my decision to leave one of the most coveted jobs in tech. This isn't just about my journey – it's a reflection on career growth that might inspire you to reassess your own professional path.

Read More