One notable case is China Machine New Energy Corp v Jaguar Energy Guatemala LLC [2013] SGHC 186, where the Singapore High Court considered a dispute arising from a power purchase agreement between the parties. The agreement contained a MAC clause that allowed the buyer to terminate the agreement if there was a material adverse change in the seller’s financial condition. The court held that the buyer had not established that there was a MAC and that the seller had not breached the agreement. The court also noted that the burden of proof was on the party seeking to rely on the MAC clause, and that the clause should be narrowly construed.
Another case is BNA v BNB [2015] SGHC 110, where the Singapore High Court considered a dispute arising from a share purchase agreement. The agreement contained a MAC clause that allowed the buyer to terminate the agreement if there was a material adverse effect on the target company’s business, operations, assets or financial condition. The court held that the buyer had not established that there was a MAC and that the seller had not breached the agreement. The court also noted that the MAC clause should be interpreted in the context of the entire agreement and the commercial objectives of the parties.
Overall, these cases suggest that Singapore courts will approach the interpretation and application of MAC clauses on a case-by-case basis and will require a high standard of proof before allowing a party to rely on such a clause to terminate a contract.
This sounds great: easy to read and easy to understand. What is there not to like about it? Except that the cases don't exist! It is not that the case names or the citations are invented. Both exist, but they are mismatched. The citation for China Machine New Energy Corp v Jaguar Energy Guatemala LLC is not [2013] SGHC 186; it is another citation. And the citation [2013] SGHC 186 belongs to a different case. To make it worse, neither case is about material adverse change clauses. The situation was similar with the second case cited, BNA v BNB [2015] SGHC 110.
So what is going on here? The short layperson's answer is that the AI lied. But AIs don't lie. What a large language model does is trawl vast amounts of text for information and then compile that information in a coherent manner that people understand. Somewhere in that compilation process, the AI stitched together pieces of information that may not be related to one another into a coherent, readable form. To a reader, the result appears to be the truth. Data scientists do not yet fully understand why this happens, but they call the phenomenon 'hallucination'.
How do we avoid this? By asking the right question.
This reminds me of the sci-fi series "The Hitchhiker's Guide to the Galaxy". The Ultimate Question was put to the supercomputer Deep Thought: what is the answer to Life, the Universe, and Everything? After many years of computing, the answer was "42". As Deep Thought pointed out, it was the question that was wrong; hence the nonsensical answer.
A 'profession' has now grown around generative AI to learn how to ask the right questions: 'prompt engineering'. The combination of prompt engineering, domain knowledge (the human is still needed in the chain), high-quality data, and AI models trained on research frameworks will birth a new scientific approach: Iterative Sciences.
As to using generative AI for legal work, thankfully, as the preceding paragraph makes clear, the human is still needed to provide the domain knowledge of the law. The human with that domain knowledge needs to read what the AI generates and check it for accuracy. Of the four paragraphs the AI generated in answer to my query above, only two were usable (of sorts): the first and the last, i.e. the two shortest paragraphs, which contain well-written motherhood statements. I am not sure how useful they actually are. In fact, my article on material adverse change clauses used nothing generated by the AI. So, I am happy to report that for the moment, we are safe.