AI Won't Replace You. But It Might Replace the Version of You That Isn't Paying Attention.
There is a version of this post that reassures you. That tells you accounting jobs are safe, AI is just a tool, and everything is going to be fine as long as you get your degree and pass your exams. I could write that post.
This is not that post.
I have been a CPA for close to a decade. I teach accounting in the evenings. I work in a leadership role at a firm and I am involved in hiring. And I am going to tell you something that I genuinely wish someone had told me with the urgency it deserves: the bar for entry-level accounting has moved, and a lot of students have no idea.
I say this to my students pretty much every semester now, and I mean it more each time: whatever you are doing right now is probably not enough until you actually have a job lined up. The students who wait until their senior year to start looking for internships or positions are already fighting an uphill battle, and the pace of change in this profession is making that hill steeper every year. The world is not going to slow down and wait for you to feel ready.
Let's start with what changed.
When I graduated, a C student could get a job at an accounting firm and learn along the way. The work was forgiving at the junior level because junior-level work was time-consuming enough that firms needed bodies to do it. Someone had to build the schedules. Someone had to pull the data. Someone had to set up the models. That someone was you, fresh out of school, figuring it out as you went.
I remember being at my first firm and being able to raise my hand and pick up work, responsibilities, learning opportunities, no matter how small they were. I was eager, I took everything I could get, and that approach worked because those opportunities were there for the taking. Looking back, there was a lot of room to grow into the role.
That room is shrinking.
The knowledge gap between where I was then and where I am now makes total sense. That is just experience accumulating over time, and that part has not changed. What has changed completely is how I think about getting things done. Today, when I need something done that I do not immediately know how to approach, my first instinct is not to hand it off and wait. I start asking my AI tools and my more experienced colleagues how we can get this done faster. That is the most time and cost-efficient path, and it is a completely different reflex than what I had early in my career. The technology did not just change the tasks. It changed the thought process.
Those small learning opportunities I used to grab by raising my hand? They are becoming increasingly scarce, because a lot of that work is being handled differently now. That is not a complaint. It is just the reality students need to understand before they get there.
The honest framing on AI and jobs.
I think the conversation around AI tends to split into two camps that are both wrong. Either AI is going to replace everyone, or it is totally overhyped and nothing will really change. Neither of those is useful to you.
Here is what I actually observe: roles that require consistent human interaction are going to be augmented by AI. Roles that exist purely to execute a repeatable task are more likely to be replaced by it. If your entire job description could be written as a prompt, that job is at risk. If your job requires you to sit across from a client, read the room, ask the right follow-up question, or translate a complex financial picture into something a business owner can act on, you have runway. But you still have to earn it, and earning it looks different than it used to.
The Excel conversation nobody is having.
I want to talk about spreadsheets for a minute, because this is where I have seen the most dramatic and underreported shift in what AI can actually do.
For a long time, I was not impressed. I would ask AI tools to produce something functional in Excel and what came back was hard-coded garbage: numbers typed directly into cells, no formulas, formatting that looked like it was produced by someone who had heard of Excel but never actually used it. The chat explanation was fine. The file was useless.
That has changed, and I say this as someone who was genuinely skeptical.
A few months ago I needed to build a fixed asset and depreciation schedule. If you have built one of these from scratch, you know what is involved: nested IF statements, date logic, method tracking. It is not the hardest thing in the world but it is tedious, and depending on the level of detail you need, it can eat up a serious chunk of time even for someone experienced. I prompted Claude with enough context about the structure, the depreciation methods, and the outputs I needed, and within about 30 minutes I had a functional, formula-driven schedule that would have taken me longer to build manually just because of the volume of formulas involved. Not a rough draft. Something I could actually work with.
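To make concrete what a schedule like this encodes, here is a minimal sketch of the straight-line case in Python. This is illustrative only, not the actual Excel build described above, which involved multiple methods and more date logic; the function name and numbers are invented for the example:

```python
from datetime import date

def straight_line_schedule(cost, salvage, life_years, in_service):
    """Yearly straight-line rows: (year, expense, accumulated, net book value)."""
    annual = (cost - salvage) / life_years  # same expense every year
    rows = []
    accumulated = 0.0
    for i in range(life_years):
        accumulated += annual
        rows.append((in_service.year + i,
                     round(annual, 2),
                     round(accumulated, 2),
                     round(cost - accumulated, 2)))
    return rows

# A $10,000 asset, $1,000 salvage value, 3-year life, placed in service in 2024:
schedule = straight_line_schedule(10_000, 1_000, 3, date(2024, 1, 1))
# First row: (2024, 3000.0, 3000.0, 7000.0); the final NBV lands at salvage.
```

Even this toy version shows why evaluating the output matters: if the model hard-codes the expense column or mishandles the final-year rollover to salvage value, only someone who has built the schedule before will notice.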
Now here is the part that matters for students: I got that result because I knew what I was asking for and I knew what a good output looked like. I could evaluate it, catch what was off, and prompt my way to something better. A first-year person who has never built a depreciation schedule would not know what questions to ask, would not catch the errors, and would not be able to push the output in a better direction. That is not a criticism. It is just how these tools actually work.
AI does not level the playing field. It tilts it further toward people who already know what they are doing.
What I see in the classroom, and on resumes.
I gave my students an assignment recently that asked them to use AI to help with their research. Some of them took that way too literally. What came back was clearly copied and pasted directly from ChatGPT: random words bolded for no reason, sentence structures that no human would actually write, a complete absence of their own voice or judgment anywhere in the document. I have seen the same patterns show up on resumes, which is a particularly alarming place to find them.
My gut reaction when I see that? It is not annoyance. It is concern. Because what it tells me immediately is that this person has not critically read what they submitted. They do not know how to evaluate an output, they do not know how to present it to another person, and there is a real risk they would copy and paste something similar into a client-facing situation without thinking twice.
Anyone can copy and paste. That has always been true. What is different now is that the stakes of doing it blindly are higher, because the output looks more polished and the errors are less obvious unless you know what to look for. I have had experiences, and I suspect most of you have too, where an AI tool gave me something confidently wrong because I did not prompt it carefully or it filled in a gap with something plausible but inaccurate. If someone shows me that their only intelligence is artificial intelligence, I am not inclined to work with them. I want my colleagues, my students, and my clients to use these tools with a critical eye: prompting thoughtfully, reading carefully, and extracting information rather than just repackaging it.
What firms are actually looking for.
From what I see and hear, firm approaches to AI vary quite a bit right now. Some are leaning in aggressively, building it into their workflows and encouraging staff to experiment. Others are still working through what they are permitted to use and how. But in either environment, walking in with genuine comfort around these tools is a real asset.
What I look for in candidates, and I suspect I am not alone, is someone who knows how to use the technology but who also has the underlying skills the technology depends on. We are not yet in a world where someone who is excellent at prompting AI can outperform someone with real experience and a working knowledge of the same tools. Experience still wins. But experience combined with AI fluency is a combination that is increasingly hard to find and increasingly valuable.
Think about what Excel meant to this profession when it became the standard. It did not eliminate accountants. It made accountants dramatically more productive and raised the floor for what was expected of them coming in the door. AI is shaping up to be that kind of shift: not a replacement, but a redefinition of the baseline. The accountants who thrive will be doing more people-facing, judgment-intensive, advisory work. The number-crunching and schedule-building will increasingly be something you direct rather than something you grind through manually. That is honestly a more interesting job. But you have to show up ready for it.
So what do you actually do with this?
Go use these tools on things you already understand. Not to skip the learning, but to go deeper into it. Build the depreciation schedule yourself first, then prompt an AI to build one, then compare the two and figure out where it did better than you and where it cut corners you would have caught. That process will teach you more about both the tool and the underlying skill than either one alone.
Work on how you prompt. Most students approach AI the way they would type a search query: short, vague, missing context. The difference between a mediocre output and a genuinely useful one almost always comes down to how well you framed the question. Give context. Specify the format you need. Push back when the result is not right. That skill is learnable and it is increasingly worth having.
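The gap between a search-query prompt and a useful one can be that simple. A hypothetical before/after, using the depreciation example (these prompt strings are invented for illustration, not quoted from any real session):

```python
# A vague, search-style prompt vs. a context-rich one (illustrative only).
vague = "make a depreciation schedule in excel"

specific = (
    "Build an Excel fixed asset and depreciation schedule. "
    "Columns: asset name, cost, salvage value, in-service date, method "
    "(straight-line or 200% declining balance), useful life, annual expense, "
    "accumulated depreciation, net book value. "
    "Use cell formulas throughout, no hard-coded values, and flag any asset "
    "that has passed the end of its useful life."
)
```

The second version does the three things the paragraph above describes: it gives context, specifies the output format, and states the constraint (formulas, not hard-coded values) that the weaker outputs kept violating.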
And keep building your actual knowledge base, because that is what makes everything else work. The students who treat AI as a shortcut around learning are going to find themselves in a difficult spot, because the shortcut only works if you already know enough to check the work.
When I was starting out, there was more room to grow into the job. You could arrive a little rough and the environment would sand down the edges over time. That room is tighter now. The students who recognize that while they still have time to do something about it are the ones I am going to be genuinely excited to bring in.
The ones waiting for someone to explain this at new hire orientation are already a step behind.