
How to Use ChatGPT Effectively for Your Work

Artificial Intelligence, Generative AI, ChatGPT. These technologies have been everywhere for more than a year now, and almost everyone has encountered ChatGPT in some form. Nevertheless, we still see few truly good GPT-based applications implemented in the business world. But we have good news for you: with the right knowledge and approach, Generative AI can speed up and simplify your work.

ChatGPT, the tool developed by OpenAI, can mimic human conversation and assist in numerous areas, whether it’s travel planning, creative writing, or language learning. But it doesn’t just support personal life; it has also found its way into the world of work. Controllers, for example, can benefit from communicating with ChatGPT in various areas. Here are a few:

  • Writing quick summaries of reports and analyses at a requested length

  • Translating and proofreading texts 

  • Writing data queries (e.g., complex SELECTs in SQL; see the short example after this list)

  • Generating or summarizing text from an attached report 
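
For the data query use case, the difference between a vague and a well-specified prompt is easy to see. The snippet below builds a schema-aware prompt for a SQL query in Python; the table and column names are purely illustrative assumptions, not taken from a real project.

    # Hypothetical example: a specific, schema-aware prompt for the SQL use case.
    # The table definition is an illustrative assumption.
    table_schema = "sales(order_id, customer_id, order_date, amount_eur)"
    prompt = (
        "Act as a senior SQL developer. "
        f"Given the table {table_schema}, write a SELECT statement that returns "
        "the total revenue per customer for 2023, ordered from highest to lowest. "
        "Return only the SQL code."
    )
    print(prompt)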

Thus, generative AI can speed up, simplify, and generally support work. The problem is that not everyone uses the tool, and those who do don’t always use it well. Communicating with ChatGPT is easy in theory: we just write questions and receive answers. The harder part is formulating these questions, or prompts, so that we get the best possible responses.

Prompt engineering: To get the right answers, you need to ask the right questions

We sought a solution to this problem at Horváth when we started creating a prompting guideline for our BI developers. The goal was to provide use case ideas, tips, and suggestions for writing good prompts. We collected use cases that could help colleagues in their work and created example prompts for the given topics using prompt engineering techniques. Prompt engineering helps users give precise and well-formulated questions or instructions to generative AI in order to get the best possible answers. Additionally, we compiled guidelines for using AI-enabled chatbots, along with possibilities, prompting tips and tricks, and known limitations of GPT models. The result is a document that explains how to use GPT models properly according to these guidelines and supports good prompting.

Storming the Bastille – but do it cleverly

We used OpenAI’s official prompt engineering guide* when creating the example prompts and prompting tips. Some of its most important recommendations are listed below; the short sketch after the list shows how several of them can be combined in a single request:

  • The question should be specific and detailed. 

  • Give the model a personality to get more accurate answers (“Act as a historian specializing in the Modern Era.”). 

  • Set context for the question to make the answer more relevant (“For a novel set in the 19th century, write a description of a Victorian house.”). 

  • Provide a few example question-answer pairs on the given topic. This also helps the model understand what kind of answer is expected. 

  • If you need more complex information, break the question into multiple steps. 
    -  Prompt: “Give a brief summary of the French Revolution.” 
    -  Prompt: “Now detail the events of the Storming of the Bastille.” 

  • Specify the length and key points of the answer (“Write a 150-word summary of the advantages of solar energy.”). 

  • Specify the style and tone (“Write a humorous story about an astronaut’s adventures.”). 
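
These recommendations apply equally when the model is called programmatically instead of through the chat interface. The block below is a minimal sketch, assuming the official openai Python package (v1.x) is installed, an API key is set in the environment, and “gpt-4o” is an available model name; the historian persona, the few-shot pair, and the final question simply arrange the article’s examples as chat messages.

    # Minimal sketch: persona, a few-shot example pair, context, length, and tone
    # expressed as chat messages. The model name and message content are assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            # Persona for more accurate answers
            {"role": "system",
             "content": "Act as a historian specializing in the Modern Era."},
            # One example question-answer pair showing the expected kind of answer
            {"role": "user",
             "content": "Give a brief summary of the French Revolution."},
            {"role": "assistant",
             "content": "The French Revolution (1789-1799) overthrew the monarchy "
                        "and reshaped French society and politics."},
            # The actual question: specific, with context, length, and tone
            {"role": "user",
             "content": "Now write a neutral, 150-word summary of the events of "
                        "the Storming of the Bastille for a general audience."},
        ],
    )
    print(response.choices[0].message.content)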

An implemented use case in action

After creating the document with example prompts, we went a step further and turned it into an application. In this app, one can choose from use cases – whether it’s debugging Python code or generating dummy data – and after filling in the input parameters, the program creates a prompt. Then, with a single click, the generated prompt is copied to the clipboard, and the GPT model tailored to the topic opens. The next level of product maturity could be OpenAI integration, which would allow GPT to return the answer directly within the application.
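
Conceptually, the core of such a tool is small. The sketch below is a simplified, hypothetical illustration of the described flow in Python, not the actual Horváth application: the use-case templates, parameter names, and chat URL are assumptions, and the third-party pyperclip package is assumed to be installed for clipboard access.

    # Hypothetical sketch: pick a use case, fill in the parameters, copy the
    # generated prompt to the clipboard, and open the chat page.
    import webbrowser

    import pyperclip  # third-party package for clipboard access

    PROMPT_TEMPLATES = {
        "debug_python": (
            "Act as an experienced Python developer. The following code raises an "
            "error.\n\nCode:\n{code}\n\nError message:\n{error}\n\n"
            "Explain the cause and suggest a corrected version."
        ),
        "dummy_data": (
            "Generate {rows} rows of realistic dummy data with the columns "
            "{columns}. Return the result as a CSV table."
        ),
    }

    def generate_prompt(use_case: str, **params: str) -> str:
        """Fill the selected use-case template with the user's input parameters."""
        return PROMPT_TEMPLATES[use_case].format(**params)

    prompt = generate_prompt("dummy_data", rows="20", columns="name, country, revenue")
    pyperclip.copy(prompt)                      # one click: the prompt is on the clipboard
    webbrowser.open("https://chat.openai.com")  # open the chat tailored to the topic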

Horváth Prompt Generator Homepage

Debugging in Power BI DAX development

To debug Power BI DAX code, the user only needs to select the Develop DAX button on the homepage and then the Debugging category. After that, they can adjust the pre-set prompt context and paste the problematic DAX formula and error message into the tool.

Example Case for DAX Debugging

After filling in the parameters, the prompt is automatically generated at the bottom of the page, and by clicking the button below it, the user can send the question to the appropriate GPT model.
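
To make the result more tangible, the block below shows what a generated debugging prompt of this kind could look like. Both the broken DAX measure and the error message are made-up placeholders, and the wording is illustrative rather than the tool’s actual template.

    # Hypothetical example of a generated DAX debugging prompt; the formula,
    # error message, and wording are illustrative placeholders.
    dax_formula = "Total Sales = SUMX(Sales, Sales[Quantity] * Sales[UnitPrice]"  # missing ")"
    error_message = "The expression is not valid: a closing parenthesis is missing."

    prompt = (
        "Act as a Power BI expert specializing in DAX. The following measure "
        "returns an error.\n\n"
        f"DAX formula:\n{dax_formula}\n\n"
        f"Error message:\n{error_message}\n\n"
        "Explain what causes the error and provide a corrected formula."
    )
    print(prompt)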

The “employment” of generative AI can take many forms (e.g., developing a custom internal GPT model), but one of the simplest and cheapest options, after establishing guidelines for using AI-enabled chatbots, is to create a prompting guideline that supports employees in leveraging the benefits of generative AI. We are happy to discuss how Generative AI can help you on your digital transformation journey.

* platform.openai.com/docs/guides/prompt-engineering

Cserni, T.