
Thursday, July 3, 2025

The AI Revolution - Are You an Early Adopter, Follower, or Naysayer?

To See All Articles About: Layoffs Reports
To See All Interview Preparation Articles: Index For Interviews Preparation

Course Link

In May 2025, Microsoft laid off 7,000 employees, about 3% of its total workforce. Immediately after, however, it made an announcement that shook the corporate world: it will spend $80 billion this year on AI infrastructure. Not over the next 5 or 8 years, but $80 billion within 12 months, on AI data centers. This news matters for all of us, because it is a sign of the changes coming in the next 5-10 years.

In this video, we will talk about perhaps the biggest transformation of our generation: Generative AI and Machine Learning. 

## What's Happening Around Generative AI and Why It's So Important

Research is throwing up a lot of statistics. Recently, I was reading a report which said that AI could potentially displace 300 million jobs across the world, around 9-10% of all the jobs that currently exist. Because of this, many people are scared that AI will take their jobs. But the same fear was present when computers arrived, when smartphones arrived, when industrialization happened, and when the internet arrived, and it is being felt again today. I, being a technologist and an optimist, believe that this pivotal moment in global history is also perhaps the biggest opportunity for all of us.

The World Economic Forum predicts that between 2023 and 2027, there will be a 40% increase in demand for AI and Machine Learning jobs. In fact, since 2019, when it felt like we didn't even know about AI, AI-related jobs have been increasing by an average of 21% annually, faster than most other jobs out there in the world.

However, whenever there is such rapid movement, demand is very high, and supply is limited. I still remember when I was in school and college, ITES (IT Enabled Services) was blowing up. All our college students were in some call center or undergoing some training, and the demand was through the roof. You could walk into any call center interview, and your job was guaranteed, with amazing perks and a good lifestyle. You would get accent training for foreign languages, pick-up services, work in very good offices, and working hours were conducive, allowing you to study and work. It was a completely different era for 5 to 10 years. Then IT services came, and the same thing happened: people started going onsite to the US, UK, and Australia, earning in dollars and spending in rupees. It was a complete transformation.

I can see the same thing happening in AI as well. Demand is growing so fast that supply cannot keep up. In the US alone, AI jobs are expected to reach 1.3 million openings, but the skilled talent pool is only around 640,000, barely half of what is required. For India, I tried to find a comparable report and saw one that said that by 2027 there will be a shortfall of 1 million people: demand for 1 million AI roles, but not enough skilled people to fill them if we don't start investing in skilling right now. Microsoft's announcement is a step in exactly that direction.

Brad Smith, a senior leader at Microsoft, mentioned in an interview that AI is like the electricity of our generation, of our age. Just as electricity transformed everything around 1900, powering industrialization and the light bulb, changing how long people worked and how we interacted with machines, our jobs changed along with it. The same thing is going to happen with AI: it is ready to take on so much of the day-to-day, administrative, run-of-the-mill work that we can elevate what we do and be far more useful, instead of wasting time on things that a machine or an LLM can do today.

## The Three Types of Reactions to Technological Shifts

While researching for this video, I thought about technological shifts. In my time, I have seen two of them: one, which I will definitely say is the computer itself, because I was born in 1980, so I saw the computer revolution, at least to the point where we saw the internet become such a force and an enabler. And the second, I do believe, is the smartphone revolution in 2008-2009, when the iPhone was released, which also changed the industry so massively. So I have seen these two waves, and I see that AI is going to be perhaps the third wave, at least in my life.

Whenever such a wave comes, there are three types of reactions and three types of people:

1.  **Early Adopters:** These are the people who don't resist this change; they embrace it. They see that it is inevitable that everyone will use this, and that it would be foolish to claim otherwise. It's almost like if I had said in 2005 that there would be a phone on which we could use the internet, that because of it every person would have a computer in their pocket, and that if you moved in that direction you would build a brilliant career, people would have laughed at me and said, 'What nonsense are you talking about? Nothing is going to happen. Let's just stay on our desktops; we are happy there.' You would have missed the wave. But there were people who said, 'We know what's happening. We can see what's happening in the US. The world is so connected now that news from there reaches here instantly, and we can see what companies there are investing in.' I am telling you that Microsoft is going to spend $80 billion, that it is just one company, and that it will be spent in just one year. So imagine how important AI will be for the entire technology world. Clearly, there is a direction.

Then I was looking at this data: how long did it take these platforms to reach 100 million users? Netflix took 18 years, Spotify 11 years, Twitter 5 years, Facebook 4.5 years, Instagram 2.5 years, YouTube 1.5 years, TikTok 9 months, and ChatGPT reached 100 million users within two months. Its user base as of April 2025 is around 800 million people, which means roughly one in ten people on the planet is using ChatGPT, a piece of software that can take over the work of so many people and, of course, make work easier. This is the power, and you cannot deny it. Early adopters see these things.

2.  **Followers:** What do followers do? They look at the early adopters and say, 'These people are going there; let's go there too, because something is happening there.' I will give an example from my own life: I joined ISB when ISB was 5 years old, so I would call myself an early adopter. Someone who joins ISB today, more than 20 years on, is a follower: they have seen that many people have gone to ISB, that it's a very good school, that the outcomes are good, and so they decide to go. These are people who follow. It's not that they won't win, but their outcome will likely be somewhat smaller than that of the early adopters.

3.  **Naysayers:** These are the people who don't believe that anything like this is going to happen. Even today, I meet people who say, 'AI will not replace humans.' Take it in writing, my friend: within 50 years, you will see fewer humans and more AI. Our world will be built around AI, and that will not be a scary or a bad world to be in. It will actually be, in my opinion, a more efficient world to live in, one that leaves us time for all the things that we, as humans, should have time for.

## The Call to Action: Become an Early Adopter

Why am I telling you all this? Because I want you to become an early adopter. Being an early adopter doesn't mean you are already left behind if you didn't use AI in 2021, 2022, or 2023. It means that if you don't embrace this fully in the next 5 years, you will end up either a follower or a naysayer, and the naysayers are the ones who will most certainly be fired or laid off.

To become an early adopter, what do you need to do? You essentially have to get skilled. Skilling is the most important thing. Of course, you can learn on your own, stumble, make mistakes, and eventually get there, but the truth is that this field is changing so rapidly and dynamically that getting professional help as soon as possible is a better way to skill yourself. That is why Simplilearn offers its Professional Certificate Program in Generative AI and Machine Learning. The good thing is that the curriculum is designed by the E&ICT Academy of IIT Guwahati, so it comes with elite academic backing and a certification that carries weight.

It's an 11-month program, live, online, and interactive, so it's not a self-paced course where you learn on your own. And if you actually look at the learning path and what is covered, which is where I spent my time, it covers everything one needs to know about Generative AI and Machine Learning right now. I am talking about this program because Simplilearn sponsored this video, but of course, the key is for you to recognize which course will be best for you when you decide to step ahead and invest in your skilling around Generative AI and Machine Learning. In my experience and research, I found this course to be quite complete in what it covers, with the added advantages of the backing it has from IIT Guwahati and the fact that it comes from a recognized platform.

I would encourage you to check out the course, see if it fits your requirements, your aptitude, and your budget, and then make a call. You will get certificates from both IIT Guwahati and IBM, who have partnered for this course. Along with the industry certification from IBM, there are masterclasses by experts, plus AMA sessions and hackathons, so that whatever you learn, you can actually apply.

## The Market Potential and a Personal Anecdote

The market size of Generative AI in 2022 was about $40 billion. In the next 10 years, it is expected to reach $1.3 trillion. That's an annual growth rate of 42%. If any investment is giving you a 42% annual growth rate, take it with your eyes closed. And in my head, this is the investment to make. If we talk about India, it is said that by 2025-2026, AI will have a positive impact of $100 billion on India's GDP.

I joined Twitter in 2007. But at that time, I didn't take it seriously; it seemed like a very complicated platform. Who uses it? What kind of people are there? What do they talk about? But in 2009-2010, there was a discussion on Twitter about something that people, especially in the tech world and in Silicon Valley, were becoming very interested in, and I remember hearing about Bitcoin for the first time around then. I thought it was nonsense. Did I have ₹10,000-₹20,000 to invest in Bitcoin at that point in time? Yes, but I didn't invest. Why? Because it was a technological shift where I was not an early adopter. In fact, I would argue I was a late follower, because I bought my first Bitcoin around 2014-2015 and only became serious around 2019-2020. I am not a naysayer, but the point I am trying to make is this: if I had invested ₹10,000 in Bitcoin in 2010, after seeing and reading everything on Twitter, as an early follower, do you know what its value would be today? ₹2,370 crore. And even if I had lost that ₹10,000, I would not be poor today. That's the way you have to think about Generative AI and Machine Learning today.

If you learn this, invest your time in it, and diligently try to improve your skills, you may have a completely different outcome in the next 5 to 10 years from what you can get today. But if, God forbid, for whatever reason, this whole Generative AI and Machine Learning hype doesn't pan out, you will still end up fine. You won't lose anything; you won't be poor; you won't be lost in your life, because that's the power of being an early adopter. When you make a move, you end up learning something new that will set you up for life. The question is only how high you will go; you won't go down. So I encourage you to take that leap forward and invest in learning about AI professionally, from a skilled, certified place of repute. And because this video is sponsored by Simplilearn, I have presented one course option which you can evaluate, and I think it might suit your requirements very well.
Tags: Agentic AI,Artificial Intelligence,Generative AI,Large Language Models,Technology,Interview Preparation,

Sunday, June 29, 2025

30 Interview Questions on AI Engineering (from the chapter "Introduction to Building AI Applications with Foundation Models")

To See All Interview Preparation Articles: Index For Interviews Preparation

Generate 30 questions (10 easy, 10 medium, 10 complex) along with answers from this PDF

Here are 30 questions (10 Easy, 10 Medium, 10 Complex) based on the provided PDF content, along with their answers and page references:


Easy Questions

  1. What term describes AI models like GPT-4 and Gemini?

    • Answer: Foundation Models.

    • Page: 9

  2. What enables language models to scale into LLMs?

    • Answer: Self-supervision.

    • Page: 6

  3. What are the two main types of language models?

    • Answer: Masked language models and autoregressive language models.

    • Page: 4

  4. What is the basic unit of a language model?

    • Answer: Token. (A short tokenization sketch follows this list.)

    • Page: 3

  5. Name one common AI engineering technique for adapting models.

    • Answer: Prompt engineering, RAG, or finetuning.

    • Page: 11

  6. What is the most popular AI use case according to surveys?

    • Answer: Coding.

    • Page: 20

  7. What does "human-in-the-loop" mean?

    • Answer: Involving humans in AI decision-making processes.

    • Page: 31

  8. What metric measures the time to generate the first token?

    • Answer: TTFT (Time to First Token).

    • Page: 33

  9. Which company launched the code-completion tool GitHub Copilot?

    • Answer: GitHub (owned by Microsoft).

    • Page: 20

  10. What does LMM stand for?

    • Answer: Large Multimodal Model.

    • Page: 9
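
To make the answer to Easy question 4 ("Token") concrete, here is a minimal tokenization sketch. It assumes the tiktoken library is installed (not something the chapter prescribes), and the exact token splits depend on the vocabulary used, so treat the output as illustrative only.

```python
# Minimal subword-tokenization sketch (assumes `pip install tiktoken`).
# The exact splits depend on the tokenizer/vocabulary; this is illustrative only.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # one commonly used vocabulary

for text in ["cooking", "Foundation models are general-purpose."]:
    ids = enc.encode(text)
    pieces = [enc.decode([i]) for i in ids]
    print(f"{text!r} -> {len(ids)} tokens: {pieces}")
```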


Medium Questions

  1. Why do language models use tokens instead of words or characters?

    • Answer: Tokens reduce vocabulary size, handle unknown words, and capture meaningful components (e.g., "cook" + "ing").

    • Page: 4

  2. How does self-supervision overcome data labeling bottlenecks?

    • Answer: It infers labels from input data (e.g., predicting next tokens in text), eliminating manual labeling costs. (A small sketch follows this list.)

    • Page: 6–7

  3. What distinguishes foundation models from traditional task-specific models?

    • Answer: Foundation models are general-purpose, multimodal, and adaptable to diverse tasks.

    • Page: 10

  4. What are the three factors enabling AI engineering's growth?

    • Answer: General-purpose AI capabilities, increased AI investments, and low entry barriers.

    • Page: 12–14

  5. How did the MIT study (2023) show ChatGPT impacted writing tasks?

    • Answer: Reduced time by 40%, increased output quality by 18%, and narrowed skill gaps between workers.

    • Page: 23

  6. What is the "Crawl-Walk-Run" framework for AI automation?

    • Answer:

      • Crawl: Human involvement mandatory.

      • Walk: AI interacts with internal employees.

      • Run: AI interacts directly with external users.

    • Page: 31

  7. Why are internal-facing AI applications (e.g., knowledge management) deployed faster than external-facing ones?

    • Answer: Lower risks (data privacy, compliance, failures) while building expertise.

    • Page: 19

  8. What challenge does AI's open-ended output pose for evaluation?

    • Answer: Lack of predefined ground truths makes measuring correctness difficult (e.g., for chatbots).

    • Page: 44

  9. How did prompt engineering affect Gemini's MMLU benchmark performance?

    • Answer: Using CoT@32 (32 examples) instead of 5-shot boosted Gemini Ultra from 83.7% to 90.04%.

    • Page: 45

  10. What are the three competitive advantages in AI startups?

    • Answer: Technology, data, and distribution.

    • Page: 32
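
To illustrate the self-supervision answer in Medium question 2, the sketch below shows how next-token prediction derives (input, label) pairs from raw text with no manual annotation. The whitespace "tokenization" is a simplification assumed purely for readability.

```python
# Self-supervision sketch: labels are inferred from the data itself.
# Whitespace tokenization is a deliberate simplification for illustration.
text = "the cat sat on the mat"
tokens = text.split()

# Each training example is (context so far, next token to predict).
examples = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

for context, label in examples:
    print(f"input: {context!r:40} -> label: {label!r}")
```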


Complex Questions

  1. Why do larger models require more training data?

    • Answer: Larger models have higher capacity to learn; more data maximizes performance (not efficiency).

    • Page: 8

  2. Explain how AI engineering workflows differ from traditional ML engineering.

    • Answer:

      • ML Engineering: Data → Model → Product.

      • AI Engineering: Product → Data → Model (due to pre-trained models enabling rapid iteration).

    • Page: 47 (Figure 1-16)

  3. What ethical concern arises from AI-generated SEO content farms?

    • Answer: Proliferation of low-quality, automated content risks degrading trust in online information.

    • Page: 24

  4. How did Goldman Sachs Research quantify AI investment growth by 2025?

    • Answer: $100B in the US and $200B globally.

    • Page: 13

  5. What inference optimization challenges exist for autoregressive models?

    • Answer: Sequential token generation causes high latency (e.g., 100 tokens take ~1 second at 10 ms/token). (See the short arithmetic sketch after this list.)

    • Page: 43

  6. Why might GPU vendor restrictions pose a "fatal" risk for AI products?

    • Answer: Bans on GPU sales (e.g., due to regulations) can cripple compute-dependent applications overnight.

    • Page: 35

  7. How does the "data flywheel" create a competitive moat for AI startups?

    • Answer: Early market entry gathers usage data → insights improve products → attracts more users → reinforces data advantage.

    • Page: 32 (Footnote 21)

  8. Contrast pre-training, finetuning, and post-training.

    • Answer:

      • Pre-training: Training from scratch (random weights).

      • Finetuning: Adapting a pre-trained model (application-level).

      • Post-training: Further training by model developers (e.g., instruction-tuning).

    • Page: 41–42

  9. What are the three layers of the AI engineering stack?

    • Answer:

      1. Application development (prompts, interfaces).

      2. Model development (training, datasets, optimization).

      3. Infrastructure (serving, compute, monitoring).

    • Page: 37

  10. How did the 2023 Eloundou et al. study measure "AI exposure" in occupations?

    • Answer: % of tasks where AI reduces completion time by ≥50%. Fully exposed jobs included mathematicians, tax preparers, and web designers.

    • Page: 17
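
For Complex question 5, here is a back-of-the-envelope sketch of why autoregressive decoding is slow. The 10 ms/token figure comes from the answer above; everything else is illustrative.

```python
# Back-of-the-envelope latency sketch for autoregressive decoding.
# Tokens are generated one at a time, so decode latency grows linearly with output length.
TIME_PER_TOKEN_S = 0.010  # 10 ms/token, as in the answer above

for output_tokens in (10, 100, 1000):
    total_latency_s = output_tokens * TIME_PER_TOKEN_S
    print(f"{output_tokens:5d} output tokens -> ~{total_latency_s:.1f} s of decode time")
```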


These questions and answers comprehensively cover key concepts, use cases, and technical nuances from Chapter 1 of the provided text.

Questions generated using DeepSeek

Tags: Agentic AI,Interview Preparation,Generative AI,Artificial Intelligence,Large Language Models,

Thursday, June 19, 2025

Interview Questions on "Arrays" - Ch 2 - Fluent Python

All Posts on Python

Interview Questions from "Fluent Python" Chapter 2: Sequences

Easy Questions

  1. What are the two main categories of sequences based on mutability?

  2. How do container sequences differ from flat sequences?

  3. What is the key advantage of list comprehensions over map/filter?

  4. How do you prevent list comprehensions from leaking variables (Python 2 vs. Python 3)?

  5. What is the purpose of collections.namedtuple?

  6. How does tuple unpacking work in Python?

  7. What does the * operator do in tuple unpacking (e.g., a, *rest = [1, 2, 3])? (See the sketch after this list.)

  8. Why do Python slices exclude the last item (e.g., my_list[0:3])?

  9. How do you reverse a sequence using slicing?

  10. What is the difference between list.sort() and sorted()?
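
Since the questions in this chapter are listed without answers, here is a small illustrative sketch touching on Easy questions 7, 9, and 10 (star unpacking, reversing with a slice, and list.sort() versus sorted()). The variable names are mine, not the book's.

```python
# Q7: * in unpacking captures the "rest" as a list.
a, *rest = [1, 2, 3]
print(a, rest)             # 1 [2, 3]

# Q9: a slice with step -1 returns a reversed copy.
letters = ["a", "b", "c", "d"]
print(letters[::-1])       # ['d', 'c', 'b', 'a']

# Q10: sorted() returns a new list; list.sort() sorts in place and returns None.
nums = [3, 1, 2]
print(sorted(nums), nums)  # [1, 2, 3] [3, 1, 2]
print(nums.sort(), nums)   # None [1, 2, 3]
```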


Medium Questions

  1. Explain how generator expressions save memory compared to list comprehensions.

  2. How would you use a list comprehension to generate a Cartesian product?

  3. When should you use bisect instead of the in operator for membership tests?

  4. How does bisect.insort maintain a sorted sequence efficiently?

  5. Why might array.array be preferable to list for numerical data?

  6. What is the purpose of memoryview in handling large datasets?

  7. How does deque.rotate() work, and when would you use it?

  8. What happens when you assign to a slice (e.g., my_list[2:5] = [20, 30])? (See the sketch after this list.)

  9. Why does my_list = [[]] * 3 create a list with shared references?

  10. How does the key parameter in sorted() enable case-insensitive sorting?
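
A short sketch for Medium questions 8 and 9 above (slice assignment and the shared-reference pitfall of [[]] * 3); this is my own illustration, not code from the book.

```python
# Q8: assigning to a slice splices the right-hand iterable into place,
# even when its length differs from the slice being replaced.
my_list = list(range(10))
my_list[2:5] = [20, 30]
print(my_list)        # [0, 1, 20, 30, 5, 6, 7, 8, 9]

# Q9: [[]] * 3 repeats the *same* inner list object three times.
shared = [[]] * 3
shared[0].append("x")
print(shared)         # [['x'], ['x'], ['x']]

# The usual fix: build three distinct inner lists.
distinct = [[] for _ in range(3)]
distinct[0].append("x")
print(distinct)       # [['x'], [], []]
```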


Complex Questions

  1. Explain the behavior of a += b for mutable vs. immutable sequences. (See the sketch after this list.)

  2. Why does t[2] += [50, 60] raise a TypeError but still modify a tuple’s mutable element?

  3. How does NumPy’s ndarray improve performance for numerical operations?

  4. Discuss the performance trade-offs of using deque vs. list for FIFO/LIFO operations.

  5. How can memoryview.cast() manipulate binary data without copying bytes?

  6. When would you use array.tofile() instead of pickle for saving numerical data?

  7. Explain how the key parameter in sorting functions leverages stability (e.g., Timsort).

  8. How does bisect support efficient table lookups (e.g., converting scores to grades)?

  9. Why is deque thread-safe for append/pop operations?

  10. Compare the performance of array.fromfile() vs. reading floats from a text file.
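
For Complex questions 1 and 2, here is a small sketch of augmented assignment on sequences. The tuple values mirror the classic example the questions reference; the rest is illustrative.

```python
# Q1: += on a list mutates in place (same object); on a tuple it rebinds to a new object.
lst = [1, 2]
lst_id = id(lst)
lst += [3, 4]
print(lst, id(lst) == lst_id)   # [1, 2, 3, 4] True

tup = (1, 2)
tup_id = id(tup)
tup += (3, 4)
print(tup, id(tup) == tup_id)   # (1, 2, 3, 4) False

# Q2: the item assignment fails, but the in-place extend of the inner list has already happened.
t = (1, 2, [30, 40])
try:
    t[2] += [50, 60]
except TypeError as exc:
    print("TypeError:", exc)
print(t)                        # (1, 2, [30, 40, 50, 60])
```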


These questions cover core sequence operations, performance optimizations, and practical applications from the chapter, suitable for evaluating a candidate's depth of understanding.

Wednesday, June 18, 2025

What is the Python Data Model?

All Questions From This Chapter

The Python Data Model is the framework that defines how Python objects behave and interact with each other. It’s the foundation for all built-in behavior in Python, such as arithmetic operations, attribute access, iteration, string representation, and more.

🧩 Key Idea:

The Python Data Model is made up of special methods (a.k.a. "dunder" methods — methods with double underscores like __len__, __getitem__, __str__, etc.) that let you hook into the language’s syntax and built-in functions.


🔍 Examples of Special Methods and What They Enable:

| Method | Triggered by | Use Case |
| --- | --- | --- |
| `__len__` | `len(obj)` | Make object countable |
| `__getitem__` | `obj[index]` | Indexing, slicing support |
| `__iter__` | `for item in obj` | Make object iterable |
| `__str__` | `str(obj)` or `print(obj)` | Human-readable string representation |
| `__repr__` | `repr(obj)` | Debug-friendly object display |
| `__call__` | `obj()` | Make object behave like a function |
| `__add__` | `obj1 + obj2` | Operator overloading |
| `__bool__` | `if obj:` | Truthiness of an object |

🎯 Why It Matters:

  • Lets you create custom classes that integrate seamlessly with Python’s built-in operations.

  • Powers frameworks like Django, Pandas, NumPy, and more.

  • Enables writing Pythonic, intuitive, and idiomatic code.


📘 Example:

```python
class Word:
    def __init__(self, text):
        self.text = text

    def __len__(self):
        return len(self.text)

    def __str__(self):
        return f"Word: {self.text}"


word = Word("Python")
print(len(word))  # → 6
print(str(word))  # → Word: Python
```
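
Building on the example above, here is an additional sketch (mine, not from the original post) showing that implementing just __len__ and __getitem__ is often enough for a class to support indexing, slicing, iteration, and in checks.

```python
# A minimal sequence-like class: __len__ + __getitem__ unlock a lot of behavior.
class Deck:
    ranks = ["A", "K", "Q", "J"]

    def __len__(self):
        return len(self.ranks)

    def __getitem__(self, position):
        return self.ranks[position]


deck = Deck()
print(len(deck))                 # 4
print(deck[0], deck[-1])         # A J
print(deck[1:3])                 # ['K', 'Q']
print([card for card in deck])   # iteration works via __getitem__
print("Q" in deck)               # membership test works too
```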

📚 Summary:

The Python Data Model is what allows Python to be flexible, expressive, and powerful. It’s the secret sauce behind how user-defined classes can behave like built-in types.

Want to dive deeper into this? Luciano Ramalho's "Fluent Python" is an excellent book focused on this very concept.

Tags: Technology,Python,Interview Preparation,

What built-in function calls __abs__?

All Questions From This Chapter

The built-in function abs() in Python calls the __abs__ dunder method on an object.


✅ Example:

```python
class Temperature:
    def __init__(self, value):
        self.value = value

    def __abs__(self):
        return abs(self.value)


temp = Temperature(-40)
print(abs(temp))  # Output: 40
```

🔍 What’s Happening:

When you write abs(temp), Python internally calls:

```python
temp.__abs__()
```

This allows you to define custom behavior for abs() on your own classes.


🧠 Summary:

  • abs(obj) → Calls obj.__abs__()

  • You must implement __abs__ if you want your custom class to support abs()

This is part of Python’s data model that allows operator/function overloading.

Tags: Technology,Python,Interview Preparation,

What are "dunder" methods? Give an example.

All Questions From This Chapter

"Dunder" methods (short for "double underscore" methods) are special methods in Python that begin and end with double underscores (__), like __init__, __str__, and __len__. They’re also known as magic methods or special methods.

These methods are used to:

  • Customize the behavior of built-in operations

  • Enable operator overloading

  • Integrate your class with Python's core language features (like iteration, context managers, etc.)


Example:

```python
class Book:
    def __init__(self, title):
        self.title = title

    def __str__(self):
        return f"Book: {self.title}"


book = Book("Fluent Python")
print(book)  # Output: Book: Fluent Python
```

  • __init__: Initializes the object (like a constructor).

  • __str__: Defines what str(obj) or print(obj) returns.


🧠 Common Dunder Methods:

| Method | Purpose |
| --- | --- |
| `__init__` | Constructor (called on object creation) |
| `__str__` | String representation (`print(obj)`) |
| `__repr__` | Debug representation (`repr(obj)`) |
| `__len__` | Length (`len(obj)`) |
| `__getitem__` | Indexing (`obj[i]`) |
| `__iter__` | Makes an object iterable |
| `__eq__` | Equality (`==`) |
Dunder methods let your objects behave like built-in types and integrate seamlessly with Python’s syntax and idioms.
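
To round out the table with two methods the Book example doesn't cover, here is a small illustrative sketch of __repr__ and __eq__; it is my own example, not taken from a specific source.

```python
# __repr__ controls the debug display; __eq__ controls the == operator.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):
        return f"Point({self.x}, {self.y})"

    def __eq__(self, other):
        return isinstance(other, Point) and (self.x, self.y) == (other.x, other.y)


p, q = Point(1, 2), Point(1, 2)
print(repr(p))   # Point(1, 2)
print(p == q)    # True (value equality via __eq__)
print(p is q)    # False (still two distinct objects)
```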

Tags: Technology,Python,Interview Preparation,