The Best AI Example of 2025 - Automate Quotes and Estimates.

Ryan Flanagan
Dec 08, 2025

TLDR: You can take a photo of your handwritten notes, upload it to ChatGPT, and get clean text you can copy, search, and reuse. That is the basic feature. Where it gets interesting is what happens next. A building assessor can turn those transcribed notes into inspection reports, quotes, calculations, and customer emails with very little extra work, using a simple automation spine based on image transcription, field extraction, and document generation. If you still retype notes from a notebook into a report at the end of the day, this is the bomb.

What problem does digitising handwritten notes solve?

Handwriting is fast at the point of capture and painful everywhere else. You see the same pattern in most roles that involve site work or meetings:

Notes are taken quickly in a notebook or on a sticky. Later, someone has to decipher and retype them into a system, document, or email. Details get lost, misread, or delayed.

It shows up as:

  • Reports that lag behind the work.
  • Quotes that miss small items from the site visit.
  • Compliance records that are incomplete.
  • Extra admin time no one budgeted for.

Digitising fixes the bottleneck between “I wrote it down” and “I did something with it”.

How do you use ChatGPT to turn handwriting into text?

The base workflow is simple and does not need any extra software beyond ChatGPT.

  • Open ChatGPT and start a new chat.
  • Click the image or paperclip icon next to the message box.
  • Upload a clear photo of your handwritten notes.

Type a short instruction such as:

“Extract the text from this image.”

Wait for the output, then skim it and correct any obvious errors. You now have digital text. You can paste it into your notes app, a document, your CRM, a spreadsheet, or a task tool.

Two small rules that make this reliable:

  • Take the photo in good light, with the whole page in frame.
  • Avoid shadows or glare across the writing.

That is enough for ChatGPT to get very close to word for word.
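
If you want the same step inside a script rather than the chat window, here is a minimal sketch using the OpenAI Python SDK. The model name, file name, and prompt wording are assumptions; swap in whatever vision-capable model and image you actually use.

```python
import base64
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

def transcribe_note(image_path: str) -> str:
    """Send a photo of handwritten notes to the model and get plain text back."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: any current vision-capable model will do
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the text from this image."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

print(transcribe_note("site_notes_page1.jpg"))  # hypothetical file name
```

The prompt is the same one you would type into the chat window; nothing clever is needed.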

How accurate is ChatGPT with handwriting?

In practice, it reads most everyday handwriting surprisingly well. Cursive, print, quick scrawl, strikethroughs, mixed caps.

The main failure modes are:

  • Blurry photos
  • Very faint pen or pencil
  • Extremely cramped or stylised handwriting (Doctors need not apply)

You still need to scan the results. The check is quick, though. You already know what the notes should roughly say, so errors stand out. The important difference is that you are reviewing, not retyping. 

The building assessor and automation

Take a building assessor doing a standard site inspection.

They capture:

  • Property address and client details
  • Observed defects and risks
  • Materials and condition notes
  • Measurements and locations
  • Recommended actions and priorities

Traditionally, this lives in a notebook. Back at the office, the assessor rebuilds the whole visit into:

  • An inspection report
  • A quote or estimate
  • A summary email to the client or builder

That transfer from notebook to system is slow, and it is where errors creep in. The same automation spine from the TLDR (image transcription, field extraction, and document generation) is the fix. Here is what it looks like when applied to a building assessment workflow.

1. Capture: handwritten notes plus photos
The assessor keeps using their notebook on site. At the end of the visit, they:

  • Take photos of each page of notes
  • Optionally attach reference photos of key defects

Those images go into a simple intake point, such as a form or table record, with fields for:

  • Client name
  • Property ID
  • Photo upload

The only change in behaviour is taking photos instead of carrying the notebook back to the office unopened.
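
There is no required schema for that intake record. As a rough sketch, the fields above might look like this in code; the names are illustrative, not tied to any particular form or table tool.

```python
from dataclasses import dataclass, field

@dataclass
class InspectionIntake:
    """One site visit, as captured at the intake point (form or table record)."""
    client_name: str
    property_id: str
    note_photos: list[str] = field(default_factory=list)    # photos of notebook pages
    defect_photos: list[str] = field(default_factory=list)  # optional reference photos
    raw_notes: str = ""    # filled in later by the transcription step
    status: str = "new"    # "new" -> "transcribed" -> "extracted" -> "drafted"

job = InspectionIntake(
    client_name="Example Client",
    property_id="PROP-0001",
    note_photos=["notes_p1.jpg", "notes_p2.jpg"],
)
```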

2. Transcription: ChatGPT turns photos into text
An automation tool watches for new entries. When it sees a new record with photos, it sends each note image to ChatGPT with a clear instruction:

“Transcribe these handwritten inspection notes.”

The output is stored back into a “raw notes” field in your data source. You now have the entire notebook page as plain text attached to that job.
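
The “automation tool” can be as small as a scheduled script or a no-code automation that fires on new records. Here is a minimal sketch in Python, reusing transcribe_note and InspectionIntake from the earlier sketches; the status values and page separator are assumptions.

```python
PAGE_BREAK = "\n\n--- page break ---\n\n"

def process_new_records(records: list[InspectionIntake]) -> None:
    """Transcribe note photos for any record that has not been processed yet."""
    for record in records:
        if record.status != "new" or not record.note_photos:
            continue
        # One transcription call per notebook page, joined into a single field.
        pages = [transcribe_note(path) for path in record.note_photos]
        record.raw_notes = PAGE_BREAK.join(pages)
        record.status = "transcribed"

process_new_records([job])  # "job" is the example record from the capture sketch
```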

3. Extraction: pull out the fields you always need
From there, the same automation can run a series of short prompts over the transcribed text, each focused on one piece of information:

  • “Extract the client name from these notes.”
  • “Extract and format the property address.”
  • “List all defects observed with location and brief description.”
  • “Extract all measurements and associate them with the described area.”
  • “Write a concise narrative summary of the inspection suitable for the opening of a report.”

Each response is written back into its own field: client name, address, defect list, measurements, narrative, and so on. Now you have structured data instead of a wall of text.
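
Here is a sketch of that extraction pass, reusing the client object from the first sketch. One prompt per field keeps the outputs easy to store and easy to review; the prompt wording and field names are illustrative.

```python
FIELD_PROMPTS = {
    "client_name": "Extract the client name from these notes. Return only the name.",
    "property_address": "Extract and format the property address.",
    "defects": "List all defects observed with location and brief description.",
    "measurements": "Extract all measurements and associate them with the described area.",
    "summary": ("Write a concise narrative summary of the inspection "
                "suitable for the opening of a report."),
}

def extract_fields(raw_notes: str) -> dict[str, str]:
    """Run one short prompt per field over the transcribed notes."""
    fields = {}
    for name, prompt in FIELD_PROMPTS.items():
        response = client.chat.completions.create(
            model="gpt-4o",  # assumption: same model as the transcription step
            messages=[{"role": "user",
                       "content": f"{prompt}\n\nNotes:\n{raw_notes}"}],
        )
        fields[name] = response.choices[0].message.content.strip()
    return fields
```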

4. Output: generate reports, quotes, and client emails
Once the structured fields exist, you can feed them into templates. The pattern is the same one used for estimates: transcribed notes plus extracted fields flow into a standard document template, and a file link is stored back against the record.

For a building assessor, that can look like:

  • A formatted inspection report in a standard layout
  • A draft quote, with defect items and quantities pulled through
  • A draft client email that explains the findings in plain language

The automation creates the drafts. The assessor reviews, adjusts wording or pricing, then sends.
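
The generation step is mostly templating. Below is a minimal plain-text sketch; in practice the same fields would feed a Word, PDF, or quoting template, and the wording here is purely illustrative.

```python
REPORT_TEMPLATE = """INSPECTION REPORT
Client: {client_name}
Property: {property_address}

Summary
{summary}

Defects observed
{defects}

Measurements
{measurements}
"""

EMAIL_TEMPLATE = """Hi {client_name},

Thanks for having us out to {property_address}. In short: {summary}

The full report and quote are attached as drafts. Please call if anything is unclear.
"""

def draft_outputs(fields: dict[str, str]) -> dict[str, str]:
    """Return draft documents keyed by type; a human reviews before anything is sent."""
    return {
        "report": REPORT_TEMPLATE.format(**fields),
        "email": EMAIL_TEMPLATE.format(**fields),
    }
```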

5. Why this is “quietly powerful” for a building assessor
Nothing about the site visit changes. The assessor still writes by hand.
What changes is everything after they leave the property:

  • Less time retyping notes
  • Fewer missed items in reports and quotes
  • Faster turnaround for clients
  • Better records if you are ever audited or challenged

If you ignore this and keep the manual process, the failure modes stay the same: delayed documentation, inconsistent detail, higher admin load, and more room for dispute.

This is still an AI-assisted workflow, so you need some basic discipline:

  • Always have a human review the outputs before anything goes to a client or regulator.
  • Decide what data is suitable for this workflow under your policies.
  • Keep an audit trail. Store the original note images, the transcribed text, and the final report or quote together (one way to lay that out is sketched after this list).
  • Document the prompts you use for extraction, so the process is repeatable and explainable.
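
One simple way to keep that audit trail is a folder per job that holds every artefact together. This is a rough sketch, not a standard; the folder layout and file names are assumptions.

```python
from pathlib import Path
import shutil

def archive_job(job_id: str, note_photos: list[str], raw_notes: str,
                prompts: dict[str, str], report_text: str,
                root: str = "inspection_archive") -> Path:
    """Keep the originals, the transcription, the prompts, and the draft output together."""
    job_dir = Path(root) / job_id
    (job_dir / "note_photos").mkdir(parents=True, exist_ok=True)
    for photo in note_photos:
        shutil.copy(photo, job_dir / "note_photos")      # original images
    (job_dir / "raw_notes.txt").write_text(raw_notes)    # transcription
    (job_dir / "prompts.txt").write_text(                # extraction prompts used
        "\n\n".join(f"{name}:\n{text}" for name, text in prompts.items()))
    (job_dir / "report_draft.txt").write_text(report_text)  # final draft
    return job_dir
```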

Used this way, ChatGPT is not “doing the work for you”. It is removing the transcription grind so you can focus on judgment and quality control.

FAQs

Q: Do I need any special handwriting style for this to work?
A: No. As long as the photo is clear, ChatGPT usually handles normal handwriting well. Only very messy or faint notes cause regular issues.

Q: What if my inspection spans several pages?
A: You can upload multiple images in one go. The model will read each page. Your automation can combine or separate them as needed.

Q: Can this misread measurements or technical terms?
A: Occasionally, yes, which is why human review is mandatory. The benefit is that you are correcting the odd mistake, not typing every line from scratch.

Q: Do I have to rebuild my whole system to use this?
A: No. You can start with one simple workflow: notes in, transcription, extraction of a handful of fields, and a single report template. You can expand from there once it works.

Q: What if my organisation has strict data rules?
A: Then treat this as you would any cloud tool. Involve whoever owns data governance, define what can and cannot be sent, and set rules for retention and access.

If you want to stop at “photo to text”, you can implement that in an afternoon. If you want to go further and design reliable, auditable workflows around it, you need a bit more structure.

The AI Fundamentals Masterclass is built for that. It focuses on simple, concrete use cases like this, shows you how to pick the right tasks, and helps you design flows that respect your constraints instead of fighting them.