
Thursday, 10 July 2025

Interviewing in the Age of AI – Why the Hiring Process Needs to Change

One thought I’ve been wanting to talk about for a while is how hiring should change now that AI is changing everything around us—including how people apply and how companies hire.

A Quick Look Back

A few years back, things were simpler. There was a quick screening call, maybe an onsite interview, and that was enough to decide whether someone was a good fit. Then companies started using online tools to make the process easier: assessments, take-home tasks, Zoom interviews, and applicant tracking systems (ATS) to auto-filter CVs. It made scheduling and logistics easier, and although there were some concerns about cheating, it was still manageable. Tools could catch it, and honestly, cheating wasn’t as common or as easy as it is now.

The AI-Driven Landscape

Today, we need to look again at every step in this process. Most CVs now are optimized using AI just to pass the ATS and make the candidate seem like a perfect fit—even if they’re not. The tasks that used to take a day or two can be done in a few minutes. The online assessments can be solved instantly with AI. All of this makes the modern hiring process harder than ever. Personally, I’ve seen cases where I was sure the person was cheating, and suddenly I’m in a full Detective Conan episode trying to prove it.

At the same time, HR teams and hiring managers are depending more and more on AI because it saves time. And it does help in some ways—it can give feedback on a candidate’s fit, suggest improvements, and even guide people on what they should learn. But the downsides are becoming a real issue.

The Paradox of AI in Hiring

For example, AI-generated CVs often include exaggerated or fake qualifications. I’m pretty sure many people just generate them, don’t even read the final version, and submit them directly. That makes ranking and evaluating real candidates harder than it should be. On top of that, the newer online assessment tools track behavior and movement to detect AI usage or cheating, but sometimes they just make people more anxious. I remember pausing for a couple of minutes just to think about the best way to approach a question. Now, some systems flag that as suspicious behavior, as if I were using a second device. It’s frustrating.

Online interviews aren’t much better. Instead of focusing on the actual content, I often find myself trying to figure out if someone is using AI behind the scenes. It’s no longer just a technical interview—it’s a game of mental chess.

As for take-home tasks, if you say it’s okay to use AI, then fine—it’s fair for everyone. But if you clearly say “don’t use AI,” then you’re basically punishing the honest people. The ones who play fair (and these are usually the kind of people you want to hire) end up at a disadvantage. So ironically, you end up filtering out the candidates with integrity.

Are We Hiring the Right People?

I don’t have exact data on how much AI affects hiring results, but I liked this post that shows how weird the current state is for both candidates and companies:
https://www.linkedin.com/posts/nasserjr_recruitment-process-activity-7267260107128778753-_mnF

And another meme from the same person that really hits home:
https://www.linkedin.com/posts/nasserjr_well-thanks-activity-7292851244702834689-QW0M

The real problem is that while AI saves the company time, we need to ask: Are we actually hiring the right people? And even if we find a great match, what’s the guarantee they’ll even accept the offer?

Final Thoughts

I’m not against using AI in hiring, but we need to adapt. We can’t keep applying old-school interview methods in a world where AI is involved on both sides. The system needs to evolve, or we’ll keep making the wrong calls, for both candidates and companies.

Sunday, 22 June 2025

Why Students Should Think Twice Before Overusing AI Tools in College

In recent years, I’ve noticed a growing trend: many students and fresh graduates are heavily relying on AI tools during their college years. While I’m a strong believer in the power of large language models (LLMs) — for code generation, documentation, testing, deployment, infrastructure support, and more — I want to explain why you should not become overly dependent on them during your learning journey.

1. College Is for Learning, Not Just Finishing Tasks

Most college assignments and projects have been done countless times before. So why do professors still ask you to do them?

Because these exercises are not about the final output — they’re about the thinking process. They’re designed to help you build a deep understanding of computer science fundamentals. When you shortcut that process by asking an AI to do the thinking for you, you miss the real purpose: learning how to solve problems yourself.

There are public repositories where you can copy solutions and make your projects run instantly. But that’s not the point — your goal in college is not to finish, it’s to understand.

2. If AI Can Do Your Job, Why Would a Company Hire You?

If your only skill is knowing how to prompt AI tools, you’re making yourself easy to replace.

I’ve seen many people ace online assessments — solving problems involving dynamic programming, binary search, graph theory, and more — only to struggle with the basics during on-site interviews. They couldn’t analyze the complexity of a simple nested loop or explain how to choose between two sorting algorithms.
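To make that concrete, here’s the kind of basic question I mean. This is a minimal Python sketch of my own (not from any actual interview): a simple nested loop whose complexity a candidate should be able to read off directly.

```python
def count_pairs_with_sum(nums, target):
    """Count pairs (i, j) with i < j whose values sum to target."""
    count = 0
    for i in range(len(nums)):             # outer loop runs n times
        for j in range(i + 1, len(nums)):  # inner loop runs up to n-1 times
            if nums[i] + nums[j] == target:
                count += 1
    # Total comparisons: n*(n-1)/2, so the running time is O(n^2).
    return count

print(count_pairs_with_sum([1, 2, 3, 4], 5))  # pairs (1,4) and (2,3) -> 2
```

Being able to look at this and say “quadratic, because the inner loop runs for every step of the outer loop” is exactly the kind of fundamental that no amount of prompting can substitute for in a live interview.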

Overusing AI creates a false sense of competence. If you constantly rely on it to get things done, what happens when you face a challenge in real life — one that requires your own reasoning?

3. LLMs Aren’t Always Reliable for Complex or In-Depth Work

Despite all the hype, AI tools are not always accurate.

LLMs can give different answers to the same question depending on how it’s phrased. They sometimes produce code with compile errors or hallucinate incorrect explanations. Unless you understand the underlying concept, you won’t be able to judge whether the AI’s response is correct — and that’s risky.

AI should assist your thinking, not replace it.

4. Don’t Treat Private Code Like It’s Public

A major concern when using public AI tools is data leakage. Once you paste your code, tasks, or documentation into an online AI model, you have no real control over where that information ends up. Future users asking similar questions might get your proprietary logic as part of their output.

I saw this firsthand with an intern we were onboarding. After being assigned a task (with no pressure or deadline), he immediately started pasting a large portion of our internal code and task descriptions into GPT. He took the AI’s response, submitted it as a pull request — and didn’t even test it.

When I asked him about a specific line in the code, he had no idea what it did. I told him clearly: do not upload internal code, models, documents — anything — to GPT. If you need help or more time, just ask. You’re here to learn, not to impress us with how fast you can finish something.

Unfortunately, he kept doing the same thing. Eventually, our manager had to send out a formal email reminding everyone not to share internal content with public AI tools. Whether it was prompted by this intern or by others, the message was clear: this isn’t acceptable. Yet he still relied on GPT for everything, and we all came to the same conclusion—he had become someone who couldn’t write a line of code without help.


Final Thoughts

AI is a powerful tool — no doubt. But if you rely on it too early and too heavily, especially during your formative learning years, you’re sabotaging your own growth. Use it to assist you, not to bypass the learning process. Learn the foundations first. Think independently. Struggle, fail, and get better.

You’ll thank yourself later — when you're the one solving real problems, not just prompting AI to do it for you.

For example: this post was mainly written by me. I used AI to review it, then I reviewed the AI’s suggestions and made further improvements. That’s how you should be using these tools — not as a crutch, but as a sounding board to help you grow.
