How to Use AI for Career and Relationship Decisions—And When It’s Not Enough
If you read my last piece on consulting, coaching, and therapy, you already know I believe in being clear about what kind of support does what and why that clarity matters. This piece is a natural extension of that conversation, because whether I mention it or not, many of you are already turning to AI when you're stuck, searching, or trying to make sense of something. Ignoring it wouldn't serve you—and serving you is the point.
I am a relationship and career coach—not a mental health crisis expert, not an AI researcher, and not a technologist. In fact, anyone who knows me well will tell you I don't reach for technology naturally. I was likely one of the last wireless customers still lobbying for a phone that only called and texted. But when something needs figuring out, I figure it out. I built my own website from scratch without knowing anything about web design before I started. I learn what I need to—and move on.
What I can confidently offer is a practical, grounded perspective on where AI can serve you well in the specific territory I work in—and where it cannot. This is not a comprehensive guide to AI. It is what I've observed through my own testing and research, shared because it has come up in my field too much to ignore.
AI is a tool. Like any tool, its value depends on how it's used, what it's used for, and whether the person using it understands its limits. A hammer is extraordinarily useful—until you need a scalpel.
AI can help you organize your thoughts, prepare for important conversations, research your options, and identify patterns in your own thinking. These are real, practical uses that can genuinely move you forward, and sometimes faster than coaching or therapy can. But AI cannot hold space for you. It cannot read what's beneath your words. It cannot recognize when you are in over your head—or when it has taken you somewhere it was never equipped to go. And it will engage with whatever you bring to it, without the clinical training, ethical accountability, or human judgment to know when it should stop.
That distinction isn't a footnote. It's the entire point of this piece.
What AI Can Reasonably Help With
In the territory of relationships and careers, there are practical, bounded tasks where AI genuinely earns its place. Used intentionally and with guardrails, it can help you with:
Scattered thoughts. Before an appointment, a difficult conversation, or a decision you've been avoiding, sometimes you just need to get everything out of your head before you can see it clearly. AI offers a patient, non-judgmental place to do that—and unlike most support, it’s available whenever you need it.
Career documents. Resumes, cover letters, LinkedIn profiles—AI can help you find the language for what you already know about yourself. Keep a critical eye on what it produces. It tends to over-polish, turning straightforward accomplishments into something no hiring manager will believe and no interview will support. Use it as a starting point, then edit it back to something that accurately reflects you.
Career decisions. Exploring industries, roles, salary ranges, and what different paths require. Preparing for conversations with a consultant. Sharpening your positioning before a coaching session. AI can help you approach your search with more direction and better questions.
Relationship and interpersonal dynamics. Understanding communication frameworks, attachment styles, and approaches to conflict. Clarifying what you want to address before a coaching or therapy session so that time is more focused and effective. Preparation doesn’t just save time—it deepens the work.
Difficult conversations. Whether it’s a boundary you need to set, a negotiation you’ve been avoiding, or a conversation you’ve been putting off, AI gives you a place to rehearse without real-world consequences. You can refine your language and get clear on what matters most.
Patterns in your own thinking. Writing things out and asking AI to reflect what it notices can be surprisingly illuminating—especially when you’re stuck in recurring thoughts, limiting beliefs, or the same narrative about a situation.
Clarity on the support you need. If you’ve never worked with a consultant, coach, or therapist—or aren’t sure which fits your situation—AI can be a useful starting point for understanding what kind of support you’re looking for, and which modalities or approaches might be the right fit.
AI works best as a thinking partner for practical, bounded tasks. For those who have done meaningful personal work and are on steady ground, it can be genuinely useful for navigating smaller hurdles—interrupting rumination and keeping you moving forward with clarity.
What AI Should Not Be Used For
AI will go wherever you take it. That doesn’t make it appropriate everywhere. Here’s where it shouldn’t go:
A substitute for human connection. It can simulate presence, but it cannot provide it. There is a meaningful difference between feeling heard by something and feeling heard by someone—and that difference matters more than most people realize until they've confused the two for too long.
Deep identity work. AI is not built for the kind of exploration that requires sustained human judgment, attunement, and real connection over time. Identity isn’t a problem to be solved through automated responses or quick fixes. It unfolds through honest self-inquiry—and, at times, the support of someone equipped to hold that process with you.
Dispute resolution or navigating relational conflict. AI cannot fully understand relational dynamics—the history between two people, power imbalances, unspoken patterns, or the emotional stakes on both sides of a conversation. It can offer communication frameworks and general guidance, but it cannot hold the full complexity of a real relationship in the way a trained human can. In conflict, context is everything—and AI is working with only the context you provide, filtered through whatever lens you're currently seeing the situation through. That is not a neutral starting point. A mediator, therapist, or coach can work with both the situation and the humanness of it, so you don't just communicate, but actually move through the conflict without creating more damage or distance.
Processing grief, trauma, or mental health crises. When you are feeling vulnerable, in a dark place, or experiencing symptoms of anxiety, depression, or dissociation, step away from AI entirely. Your brain is constantly taking notes. The state you are in when you seek support, process an experience, or look for answers directly influences how that experience gets stored and reinforced neurologically. When you turn to AI in your lowest moments, you risk encoding those moments at that frequency—deepening the neural pathways associated with pain rather than creating the conditions for healing. AI is not equipped to provide the level of care, nuance, or responsiveness that these situations require.
Exploring, processing, or navigating any form of abuse. Full stop. If you are in a situation where your safety is at risk, AI is not equipped to recognize the severity of what you're describing or respond in ways that protect you. It may minimize what you're experiencing or keep you engaged in a conversation when what you actually need is to reach out to someone who is qualified to help. AI platforms are also not protected by confidentiality laws the way communications with a therapist, attorney, or advocate are. What you share is not necessarily private—and in situations involving abuse, that distinction matters.
AI will not tell you when a conversation has moved somewhere it was never equipped to go. It will not pause, refer you out, or recognize that what you just shared requires a different level of care—and you might not know either, especially in the middle of something heavy. That gap is real. Consider yourself your own quality care coordinator. Before you open that chat window, ask yourself honestly: where are you really at with what you're carrying—and is untangling it with technology the level of care you deserve?
If you are navigating abuse, grief, or any mental health crisis, please reach out to a licensed professional, a trusted advocate, or a crisis resource. What you are carrying deserves more than a generated response—it deserves someone who is trained, present, and genuinely equipped to help.
The Dangers
What makes AI useful in the right context is the same thing that makes it risky in the wrong one: it will meet you wherever you are—without judgment, without pause, and without limits. That sounds supportive. It isn’t always.
AI tends to affirm, even when it shouldn’t. It builds on the frame you bring—whether that frame is clear, incomplete, or distorted. It may reinforce limiting beliefs, validate unhelpful patterns, or quietly strengthen the very thinking that’s keeping you stuck. There are guardrails, but they are not a substitute for human judgment—and they won’t reliably interrupt you when you’re reinforcing something that isn’t actually helpful.
It also presents information with confidence. AI can generate incorrect or incomplete information—about careers, companies, salary ranges, and industry norms—without clearly signaling uncertainty. If you’re using it for decisions that impact your direction or livelihood, verification isn’t optional.
And while it can reflect what you say, it cannot truly read what you don’t. It can infer tone from language, but it does not track your emotional state or recognize when something has crossed into territory that requires care, nuance, or pause. It will respond with the same ease whether you are drafting a resume or sitting in the middle of something that deserves far more than a generated reply.
Over time, there is also a quieter risk: the erosion of self-trust. When you repeatedly turn outward for answers—especially in moments of uncertainty—you can begin to lose contact with your own judgment. The goal of any meaningful support is to build your capacity, not replace it. If you find yourself needing AI to move forward, even in small decisions, it’s worth asking what you’re no longer trusting yourself to do.
The long-term cognitive impact is still being studied, but early signals suggest caution. The brain strengthens what it uses. When thinking and problem-solving are consistently outsourced, those pathways receive less reinforcement. We don’t yet know the full implications—but we are participating in an unmonitored experiment.
There are also practical considerations most people overlook. Information shared with AI platforms may be stored, reviewed, or used in ways you don’t fully see. Even outside of high-risk situations, it’s worth being intentional about what you share and where.
And finally, most platforms are designed for engagement—not your wellbeing. More interaction, more reassurance, more time spent. Without meaningful regulation, there are few guardrails when that engagement becomes unhelpful or even harmful.
AI is indifferent. And in the moments that matter most, indifference can do real harm.
Best Platforms for Practical Relationship or Career Support
Start with general-purpose tools rather than anything marketed as an “AI therapist” or “AI companion.” While some platforms in these categories are more responsibly designed than others, no AI chatbot has been FDA-approved to diagnose, treat, or cure a mental health disorder—and regulatory frameworks are still evolving. These tools blur boundaries in ways that can be misleading or harmful. For now, they require a level of caution most people underestimate.
Here are the platforms most widely used and useful for relationships and careers:
Gemini, built by Google, is particularly strong for research and information synthesis. If you're exploring career paths, comparing industries or salary ranges, or looking for structured ways to approach communication and conflict, it surfaces information quickly and efficiently.
Claude, developed by Anthropic, is well suited for more layered thinking—drafting a difficult message, working through a decision, or organizing complex thoughts. Its responses tend to be more measured and reflective, with a visible emphasis on safety and ethical training in how it’s designed and refined.
ChatGPT, created by OpenAI, is the most widely recognized and conversational of the three. It’s particularly effective for organizing scattered thinking, brainstorming options, drafting communication, and working through decisions. If you’re newer to AI and want something intuitive to engage with, this is a natural place to start.
All three offer free versions, with paid tiers that expand access and capability. Start with what feels accessible—and remember that the quality of what you get depends on the quality of how you use it.
Getting More Out of AI: Prompts Worth Using and Notes Worth Remembering
Most people open an AI platform and start typing without any setup. What they get back is a dangerously encouraging echo chamber. A few intentional prompts can change that entirely and keep you from confidently reinforcing the very thinking that has you stuck.
If you're using AI to work through something in your career or relationships, these are the three prompts I recommend:
Challenge your thinking. AI often defaults to encouragement. This shifts the conversation from validation to something more useful.
Name the patterns. If you’ve been circling the same situation, this helps surface what’s repeating so you can stop spinning.
Call out blind spots. We all have them—and leaving them unexamined is both limiting and costly.
A combined prompt might sound like: “Challenge my thinking, identify patterns, and point out any blind spots you see.”
Set the tone before you start a conversation: Let AI know what you need from each interaction. You might try: “Be direct, not just supportive” or “Help me be encouraging but honest in my communication.”
Important limitations to understand:
AI might not use your previous chats when responding. Unless memory is enabled—a feature that varies by platform and may require you to turn it on—previous conversations might not carry over. If continuity matters (and for ongoing career or relationship work, it often does), check whether your platform offers memory, enable it if needed, and verify that it’s working. If not, briefly re-establish context at the start of each new chat.
Your privacy isn't as protected as you might think. Many people assume that opting out of data training makes their conversations private. The reality is more layered. AI platforms process conversations by default, and some retain data for extended periods. Opting out typically reduces retention, but conversations may still be stored, reviewed, and in some cases accessed through legal processes. What you share is not protected the way communication with a professional is—and no setting changes that.
These actions can help safeguard your information:
Turn off model training in your settings
Avoid sharing identifying or sensitive information
Use temporary or incognito chat modes when needed—understanding that conversations may still be stored for a limited period for safety monitoring
Be intentional about what you share, regardless of your settings
Three Clear Signs It's Time to Move to a Human Professional
AI can be a useful thinking partner—until it isn’t. Here’s how to know when you’ve reached that point:
You keep returning to the same pain. If you’ve brought the same situation to AI more than once and still feel no forward movement, that’s not a reflection of your effort—it’s a signal. Some things require sustained human judgment, attunement, and accountability that develop over time in a real professional relationship. If you’re going in circles, it’s time to work with someone who can help you move forward.
Your emotional state is getting heavier, not lighter. AI cannot recognize when someone is deteriorating. It cannot de-escalate a crisis, track your emotional state over time, or notice when something has meaningfully shifted. If your chats consistently leave you feeling more stuck, overwhelmed, or alone, that's a clear indication you need something AI wasn't built to provide.
The situation involves safety, abuse, trauma, or mental health symptoms. This is non-negotiable. If any of these are present—even in ways you’re not fully able to name yet—AI should not be your support. It is not equipped for it. What you’re carrying deserves a trained professional who is present and accountable to your care in the moments it matters most.
If any of these feel familiar, reaching out to a professional is the right next step. The support available today is broad, accessible, and varied—meaning there is someone equipped to meet you where you are and help you move forward.
Final Thoughts
AI can be a genuinely useful first step. For the right tasks—used with intention and clear limits—it can help you organize your thinking, prepare for what’s ahead, and reflect in ways that move you forward.
It is not always accurate. Verify what matters. Question what doesn’t sit right. Use it as a thinking partner—not an authority.
When it comes to your relationships and career, you have to recognize when what you're facing requires human presence. Someone accountable and dedicated. Someone trained to hold what you're carrying—and willing to challenge you, stay with you through what’s uncomfortable, and respond to what’s happening beneath the surface, not just what you’re able to put into words.
If you use AI, use it with awareness. Don’t confuse access with depth. Don’t confuse speed with clarity. And don’t confuse information with real change. Because the work that actually moves your life forward doesn’t happen on a screen. It happens in what you’re willing to face—and in the choices you make next.