I remember the week before I started at BCG, I studied Excel at a cafe.
I tested INDEX MATCH formulas and pivot tables.
I knew that I would grapple with data in formats I had not seen before, and that I would need to use Excel’s capabilities to structure and analyze that data.
If I were starting a career in knowledge work in 2023, I would study prompt engineering.
Why prompt engineering matters
The way we phrase our natural language questions, or prompts, fundamentally changes the quality of a model's output.
The first time an Associate looks at a credit agreement, they might ask, "What is an early amortization event?"
They're surprised to receive a generic definition of an early amortization event, one that has nothing to do with the credit agreement they're looking at.
If they update the prompt to "What events constitute an early amortization event?", they get their desired result.
Better yet, when they update to "What events constitute an early amortization event? Provide the results in a bulleted list," they get their desired result already formatted and slide-ready.
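To make the progression concrete, here is a minimal sketch of the three iterations as plain prompt strings (the variable name and comments are my own; any chat-completion API could be called with these):

```python
# The three iterations above, as prompt strings. Each revision adds
# precision or formatting instructions to the same underlying question.
prompts = [
    # v1: too generic -- the model answers from general knowledge
    "What is an early amortization event?",
    # v2: asks for the events themselves, not a definition
    "What events constitute an early amortization event?",
    # v3: adds an output-format instruction for a slide-ready answer
    "What events constitute an early amortization event? "
    "Provide the results in a bulleted list.",
]
```

Each string would be sent as-is to the model alongside the credit agreement; only the wording of the question changes.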
So how can I get better?
I’ll share three quick tips you can start with today:
1/ A quick trick: ask the generative model for a better way to phrase your query.
E.g., “I am looking for early amortization events in a credit agreement using generative AI. Write me a prompt to ask the generative AI engine.”
2/ Talk to the model like you’d talk to your analyst: tell it what to include and what not to include, and provide enough context that it understands your question and task.
E.g., “Please provide a detailed explanation of potential early amortization events found in this agreement. Include common triggers, how they might be defined within the agreement, and their potential impacts on both the borrower and the lender.”
This is unlike semantic search, where fewer words are often better for retrieval.
With generative AI, you should not leave parts of your question open to interpretation. Be clear about what you’re looking for.
3/ (A classic) Break your task into steps, and tell the model to think step by step.
E.g., “Let’s think step by step. First, check whether the credit agreement has an early amortization event. If it does not, say ‘no’. If it does, then please provide a detailed explanation…”
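The three tips can be folded into a single reusable prompt template. Here is a minimal sketch in Python (the function name and exact wording are my own; the returned string would be sent as the user message to whichever model you use):

```python
def build_prompt(document: str) -> str:
    """Assemble a step-by-step prompt for scanning a credit agreement.

    Combines the tips above: explicit instructions, analyst-level
    context, and a "think step by step" decomposition of the task.
    """
    instructions = (
        "Let's think step by step.\n"
        "1. Check whether the credit agreement below contains an early "
        "amortization event.\n"
        '2. If it does not, answer "no".\n'
        "3. If it does, provide a detailed explanation of each triggering "
        "event, how it is defined within the agreement, and its potential "
        "impacts on both the borrower and the lender.\n"
        "Provide the results in a bulleted list.\n"
    )
    # Append the source document so the model answers from it,
    # not from general knowledge.
    return instructions + "\n--- CREDIT AGREEMENT ---\n" + document
```

Keeping the instructions in one place like this also makes it easy to iterate on the wording without touching the rest of your pipeline.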
Context windows matter (longer is sometimes better, but not always)
The model matters (we have extensive experimentation here for different tasks)
Few shot examples (sometimes) matter
Fine tuning (sometimes) matters
We’ll talk about these more in another blog post.