GPT3Chat is a Coward


4 Responses to GPT3Chat is a Coward

  1. Just me says:

    One of the local lawyer blogs asked ChatGPT to write a motion. http://sdfla.blogspot.com/

    The response won’t cause any associates to lose their jobs today – but it looks to me like it’s coming. Any predictions on where the business of law is headed in this brave new world?

    Should law schools be looking ahead 10 or 20 years to a time when ChatGPT-type technology makes much of the practice of law as simple as asking Siri to write your motion or contract? In other words, can law schools continue to justify (to the extent it was even justifiable before ChatGPT) charging six figures for an education that may very well be obsolete for many current students before they hit middle age?

    • Currently, ChatGPT results are not reliable enough for anyone sensible to risk their malpractice insurance. It is plausible that at some point in the future, it will be right much more often than not, which may mean that it’s a decent gamble for routine or small-$ matters. But not for anything needing originality, and probably not even for routine large-$ matters.

      There’s no question that learning how to frame good queries for an AI assistant will become an important skill at some point, but how best to do that will likely depend a lot on how the interface works and how it was trained, so it’s too early to start teaching it until we have a somewhat stable, more reliable tool.

      I expect we’ll get to a time when an ‘AI associate’ has the quality and reliability we associate with foreign contract counsel (think, ‘contract lawyer from India’). Which is to say, reliable only up to a point.

      What could be interesting is the utility to pro se civil clients….

  2. Just me says:

    I suspect that you’re right about the foreign contract counsel comment. But I don’t think that detracts from the larger point that there will be enormous disruption in the practice of law and that there will be far fewer jobs for lawyers.

    Two main differences between foreign contract counsel and an AI associate are: 1) the barriers to entry for using foreign contract counsel (both practical barriers like finding one, and psychological barriers like being comfortable with the idea), and 2) timing (an AI associate will do the work now, whereas a human foreign contract lawyer will do it tomorrow, maybe). That, and the possible cost savings of an AI associate versus a foreign contract lawyer (we still don’t really know what a sophisticated AI associate will cost, but it’s reasonable to suspect it will be WAY cheaper than a human associate, whether foreign or local), could make AI associates much more appealing than a foreign contract lawyer.

    While an AI associate is not likely to genuinely replace a human associate at a white shoe firm anytime soon, I can absolutely see the associate ranks shrinking precipitously at smaller firms. Think of the way that legal assistant roles shrank as a result of Microsoft Word, Adobe, email, and VOIP.

    Imagine a five(ish) lawyer law firm. That firm still has assistants, but a firm that size may once have had two or three times as many. Most lawyers now type their own motions (remember dictation tapes?), handle their own electronic court filings, and manage their own files. Remember rooms full of file cabinets with “filing inboxes” and people spending all day sorting that paper? Now those documents get dragged and dropped into an electronic file in seconds. Even if a law firm has assistants doing that work rather than the lawyers doing it themselves, one assistant can do what many used to do. What’s a copy machine? Remember law firms that had people whose entire job was making copies all day? Small law firms don’t need a mail room anymore, or staff to receive or send paper mail. Unless your small firm is a retail practice with lots of walk-ins or a volume practice (think hundreds of PIP cases), your firm might not have a receptionist anymore either.

    Think about how many small firms employ lawyers whose job is basically to ghostwrite simple contracts, demand letters, discovery requests, etc. for a boss who then proofs and finalizes the draft. The boss, or a senior associate/junior partner, does all of the heavy lifting like strategic thinking and decision making; but one, two, or three junior associates spend their days mashing away at a keyboard producing passable first drafts. How many of those grunt lawyer jobs will remain? Some will survive, but not many – not at small firms.

    And to your point about pro se litigants, how much of the AI associate’s new power will cause lower-level legal work never to go to lawyers to begin with? How many clients will try their hand at AI partnership agreements, purchase and sale contracts, employment agreements, corporate resolutions, insurance claims, demand letters, child support calculations (it’s a statutory formula), etc.? This kind of work is the lifeblood of small law firms.

    I wonder whether you are underestimating how much of this kind of work pays the bills for small firm lawyers.

    A last thought: how will this impact lawyer training? Writing is the way that most young lawyers learn the craft – it’s how they learn to organize their thoughts and separate the wheat from the chaff. More AI associate first drafts mean less writing and less human learning.

    • Michael says:

      As I said, once the tech stabilizes a bit, we’ll need to train lawyers in how to make the most of the tech.

      I remain skeptical that existing reliance on firm precedents and form books will be substantially displaced. Even if it is, the work of reviewing the AI’s output won’t be that much less than the work of choosing the right form, although there may be a convenience factor. The exciting question is how much time will be saved when you need to modify the form. And we don’t know that yet.

      Suppose, however, that the optimists are right, and this is a sea change. Then law school training will need to incorporate use of the new tools. But the skills needed to evaluate the outputs won’t change. What could change, if law firm economics change, is the number of people getting enough practice experience to become experienced master lawyers. My own guess is that to the extent we get more reliable AI lawyers, we won’t get there via generic large language models, but rather by tuning a machine learning model, whether an LLM base or something else, on examples of expert work. Or by running the model a few tens of thousands of times and having the output evaluated and corrected by experts. For me, the real issue is how we’ll ensure a supply of well-trained and experienced experts to fill that role if lawyer economics change so much that they constrict the pipeline or cut out certain kinds of work for junior lawyers. Cf. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3114347 for a discussion of the issue in the medical context.

      The other thing that the optimist story leaves out is how much LLMs and ML systems in general are backward-looking. They don’t do great with new things. (Overblown or not, there’s a real insight motivating Cory Doctorow, Our Neophobic, Conservative AI Overlords Want Everything to Stay the Same (1/1/2020).) Law changes quite a lot. That will be a challenge well beyond writing a memo on well-trodden ground.
