Year of Hope

by Hadley Bradley


29th — I’m taking some time off work between Christmas and New Year. So I spent a nice relaxing hour painting this ladybird on a flower. I painted it using acrylic paints.

Acrylic painting of a ladybird sat on a flower.

21st — In the new year I’m going to be publishing my short stories as a collection, making them available in both EPUB & Kindle formats. The idea is to sell the collection via Smashwords. I found an article by Kevin McConnell on how he published his technical book and have adapted his AsciiDoc templates to suit my requirements. I now have a process that builds the different ebook formats from a single AsciiDoc project.

Se7en Book Cover

However, I needed to create some cover art. This is what I came up with. The cover was actually produced using the AI art generator from app.wombo.art. The generator is great: you enter some keywords, select an artistic style and hit generate. I took the image and then knocked up a quick Go program to add the title in a decent font.

To get the three-dimensional book image you see here, I uploaded the cover image to 3dbook.xyz and tweaked the settings.

4th — It’s been interesting keeping track of all the new announcements coming out of AWS re:Invent. An interesting one for me was the announcement of the new S3 storage class, Glacier Instant Retrieval. My issue with Glacier has always been around the retrieval times; multi-hour waits would never work for health applications. Amazon S3 Glacier Instant Retrieval delivers the lowest-cost storage for long-lived data that is rarely accessed and requires millisecond retrieval.

That instant retrieval makes it a viable option for certain types of applications that need to archive and secure data for the long term yet occasionally need access to it.

I’ve been thinking about developing a backup/archiving application for IHE Audit Messages for a while. It’s important to keep hold of the audit messages for many years to comply with IG retention rules. However, the sheer volume of messages generated in a regional health exchange is staggering. Off-loading the long-term storage to the cloud makes financial sense. You can benefit from the durability and availability that using AWS’s S3 object store offers.

I’ll be developing the solution over the next couple of months. Hopefully, before the next financial year starts in 2022, I can have ATNA Vault working in a production setting.


25th — I’ve started investigating the feasibility of developing an ATNA ARR message aggregator. The server will accept ARR messages sent to a UDP port and process them by generating a ULID as the filename then persisting the raw message to an S3 bucket. It’s the first time I’ve written a UDP server, and I needed a way to send multiple messages to the server in parallel. By sending multiple messages in parallel, I can simulate a live environment.

I used the Netcat (or nc) command below to send a single transaction to the server. Netcat is a command-line utility that reads and writes data across network connections, using the TCP or UDP protocols.

nc -u -w0 localhost 9000 < pdq.xml

Next, I installed the parallel command on my Linux development machine so that I could send four messages in parallel (almost simultaneously). The command I used was:

parallel -j4 'cat {} | nc -u -w0 localhost 9000' :::
    login.xml pdq.xml docquery.xml logout.xml

The command shown above creates four parallel jobs, each processing one of the four filenames on the right. First, each filename is substituted into the cat command where the braces ({}) are positioned. Then, the output from the cat command is piped into the nc command.

20th — At the start of November 2021, one of the largest web applications I’ve ever developed went live. As part of supporting the organisations, I’ve built an operations dashboard that lets me see all the stats and logs from the application in one place. The application uses a private S3 bucket to store cached files. This bucket has a life cycle policy to delete files twenty hours after being created. I wanted the dashboard to show the count of items in this bucket and the total size of the bucket to keep track of consumption. So I’ve written an article on using the Go AWS API to get CloudWatch Metrics for an S3 bucket.

18th — I’ve been working on deploying a workload stack to AWS which includes an Elasticsearch domain. Before I could get the stack to deploy correctly, I needed to create a service-linked role so that Elasticsearch could create resources within the VPC. I’m writing this here for my future self:

aws iam create-service-linked-role \
    --aws-service-name es.amazonaws.com \
    --profile profile-name \
    --region eu-west-2

I also needed to pull out the IAM access keys and access secrets from the Terraform state. I managed to do it using this command within the deploy script.

terraform state pull |
    jq '.resources[] | select(.type == "aws_iam_access_key") |
    .instances[0].attributes' |
    grep '\("id":\|"secret":\|"user":\)'

And to pull the EC2 instance name and instance ID:

terraform state pull |
    jq '.resources[] | select(.type == "aws_instance") |
    .instances[0].attributes' |
    grep '\("id":\|"Name"\)'

10th — I had to write a script which would clear down & delete all the users from a Cognito user pool. I’ve written up the process on my Cognito page under the sub-heading deleting all users from Cognito.

6th — So, it’s been a hectic few weeks, but the shared care record viewer that I’ve developed is now live. Two large acute trusts are now using it. Over 1,100 clinical staff are now getting the benefits of the improved system.

Below is a list of links I’ve found to tools and utilities that will help me on projects over the next month:


18th — I’ve finally got the shared care record viewer live. I’m now gearing up the tooling to support organisation go-lives throughout November. The whole application is developed following the twelve-factor app principles. One of these principles is that logs should be treated as an event stream: a twelve-factor app never concerns itself with routing or storage of its output stream. It should not attempt to write to or manage logfiles. Instead, each running process writes its event stream, unbuffered, to stdout.

As the web application is a series of Lambda functions, the logs get written to CloudWatch. Unfortunately, CloudWatch is a pain to navigate within the web console, so I wanted to find a command-line tool that could pull down the logs for each Lambda function to grep them locally.

After looking around, I decided to use awslogs, which is a simple command-line tool for querying groups, streams and events from Amazon CloudWatch logs.

I created a makefile which retrieves the logs for the last hour, looking for the pattern WARN:, which the application outputs as a prefix to each caught error.

since = '1h ago'

logs:
    ./awslogs get /aws/lambda/auth ALL \
        --start $(since) \
        --profile lpres \
        --aws-region eu-west-2 | grep 'WARN:'

After installing awslogs, my copy of the AWS CLI stopped working. So I had to run this command to force a reinstall of the CLI command-line tool.

sudo pip install awscli --force-reinstall --upgrade


26th — This week I tackled an interesting request. I wanted to see if it was possible to create an API to extract a patient’s Covid-19 Vaccination Record from the GP record. I’ve written up a brief description of the code I used to achieve this.

18th — At work, I’ve started working on a mobile-responsive, secure web application that will allow patients to upload images and video for consultants to review remotely. The web application generates secure S3 signed URLs to allow the patient to upload large files directly to a private, secure S3 bucket. For security, the signed URLs have a short lifetime, so they can’t be used outside the application.

The application uses XMLHttpRequest to PUT the file directly into the S3 bucket and listens to the xhr.upload.onprogress event to update a progress bar providing feedback to the user.

I’ve also progressed my Terraform knowledge and have provisioned a complex application load balancer with multiple Lambda targets attached in separate target groups. As a result, I’m now confident enough that I can deploy anything into AWS using Terraform.

8th — Worked on increasing this website’s overall performance by reducing the cumulative layout shift. I managed to increase the site’s performance score from 68% to 100% as measured by Firefox’s Lighthouse extension. I’ll be writing an in-depth article on the improvements I made soon. The Lighthouse extension is handy as it also pointed out an issue I had with the content security policy and a character encoding issue with my robots file. It’s certainly worth running it on your website.

2nd — I’ve updated my notes on hosting a secure website on S3 & CloudFront. Added an example of copying a file to S3 while setting a specific content type and character set. Updated the Lambda function to include a refined Content Security Policy and added a Permissions Policy.


31st — I had a go at painting today. It’s much harder than drawing with pencils. I managed to create this lighthouse scene using acrylic paints. It’s very relaxing, so I might be having another go.

Acrylic painting of a lighthouse next to the beach and sea.

However, it’s my ten-year-old who’s the real artist, and this weekend I set him up on Instagram so that he can showcase his progression. He’s @JosephBradleyArt; please consider following him.

29th — Started working on a side project this bank holiday weekend. I’ve been planning Retina for a while now, so it’s great to actually get some time to work on it. What is Retina? Well, it’s a JPG photo inspection function which analyses the photo and extracts useful information from the image. It’s ideal for web developers who want to automate creating and cataloguing images uploaded to a website.

28th — I’ve been chipping away at the presentation I need to give on the 6th of September. It’s starting to come together and the slides look amazing when generated with Beamer. I’m glad I took the time to learn it.

presentation slide example

I’ve made a couple of tweaks to the LaTeX code. For example, I’ve started using a 16 by 9 aspect ratio for the whole document.

\documentclass[11pt, compress, aspectratio=169]{beamer}

Also, when I include diagrams I’m using the following syntax to keep the aspect ratio of the original diagram while trying to use the maximum height and width of the slide.
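Assuming the standard graphicx options, that combination is typically expressed as follows (the figure name here is a placeholder):

```latex
\includegraphics[width=\textwidth,height=0.85\textheight,keepaspectratio]{diagram}
```

With both width and height given, keepaspectratio scales the image to the largest size that fits inside both bounds without distorting it.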


25th — This week, I have mostly been learning Terraform. Terraform is an open-source infrastructure as code software tool that provides a consistent CLI workflow to manage hundreds of cloud services. Terraform codifies cloud APIs into declarative configuration files.

I’m mainly using it with AWS and have managed to deploy a serverless Aurora cluster and an EC2 instance using a specific AMI image. I’ve also created and managed security groups, specifying ingress/egress rules.

Next, I’ll be looking at deploying an internet facing load balancer and attaching Lambda functions to it.

17th — I finally switched my password manager. I’d been meaning to move away from LastPass for a while, since they started charging for accounts and making free users choose between using their product on desktop or on mobile. I don’t mind paying for services like this; however, £31 a year was more than I wanted to pay. So I switched to Bitwarden, which charges $10 per year (~£7) for their personal premium account.

The process of moving all my passwords to Bitwarden was straightforward. I exported a CSV file out of LastPass, which Bitwarden imported flawlessly. The premium account gives you access to several reports that highlight weak passwords or passwords reused across different websites. So I spent half an hour correcting these, going through the list of services I use and closing accounts for those I no longer use. The data breach report within Bitwarden is handy; it uses the Have I Been Pwned API to see if your credentials have been exposed in a public data breach. It highlighted a data breach from Dropbox, so I instantly went in and changed my Dropbox password.

8th — Today I needed to install my wildcard SSL certificates onto a Windows server which required converting the files supplied by DigiCert into a PKCS#12 file. PKCS#12 is a binary format for storing a certificate chain and private key in a single encrypted file. So I wrote up the process for my future self. Using OpenSSL to create PKCS#12 certificate chain.

6th — End of another busy week. Not done much actual coding this week, but I have created a strategy to incorporate a structured FHIR CDR (Clinical Data Repository) into an existing XDS health exchange. The plan is to use the mXDE (Cross-Enterprise Document Data Element Extraction) and QEDm (Query for Existing Data for Mobile) profiles.

Worked with a supplier to help them define a Cognito user pool for HiPRES v2.

The LAMP testing project that I helped design with Stewart has just passed a huge milestone. One hundred thousand tests completed, helping to keep our staff safe.


31st — I’ve been asked to extend the LAMP lab receipting that I’ve developed. The lab wants the software to print a new smaller label on a Zebra printer when the sample is accepted into the laboratory. As the application is web based, I’m hoping to use Zebra’s browser print module which quickly adds USB or Network based printing support to your browser-based applications. It also allows printing of ZPL codes using JavaScript.

Having never written ZPL codes before, I intend to use labelary.com to help me design the visual look of the smaller labels.

21st — Today I had to create a presentation, my first in a long time. Unfortunately, as I use Linux for my operating system, I don’t have access to a visual tool like Microsoft PowerPoint. Plus, I find all PowerPoint presentations look the same, i.e. crap. So when I generate my system documentation, I use a proper typesetting system called LaTeX.

LaTeX supports many different document class types; one, called Beamer, is focused on producing on-screen presentations, along with support material such as handouts and speaker notes.

The beauty of the Beamer class is its ability to support themes. I just needed to find a theme that I liked. After some searching, I found Metropolis, a simple, modern Beamer theme that tries to minimize noise and maximize space for content; the only visual flourish it offers is an (optional) progress bar added to each slide.

The upside of using LaTeX with Beamer is that the resulting presentation is a PDF file that anyone should be able to open and view.

19th — I spent today moving a load of cron jobs into individually packaged Lambda functions and then using Amazon’s EventBridge to schedule the running of the Lambda functions. EventBridge supports cron-like syntax for scheduling when your functions are run. You can schedule your functions by using either UTC or GMT times. However, as we’re currently in British Summer Time, I needed to adjust my Go program to report the actual time.

This can be done by using the LoadLocation function of the time package, passing in the location/timezone you wish to use.

location, err := time.LoadLocation("Europe/London")
if err != nil {
    log.Println("WARN: " + err.Error())
}

today := time.Now()
log.Println("Report completed : " + today.In(location).Format("02-01-2006-15-04"))

6th — Spent an hour drawing this picture titled “Swan Lake”.

Swan Lake - Pencil sketch of a ballerina dancing


9th — After not playing music with other people for the last twelve months, it’s been nice to start practising with the band again. After only a couple of sessions, I’m hooked on playing and keen to devote more time to my music. So, last night I joined Carlisle St.Stephen’s Band, who made me feel very welcome.


18th — Zombies. I enjoy zombie stories. Is that odd? I’ve watched every Walking Dead episode and really liked the World War Z movie starring Brad Pitt. I’ve listened to all the Mountain Man books on Audible and I’m currently reading the excellent Surviving The Evacuation 18 book series by Frank Tayell. So if you’re worried about Zombies, read the Zombie Survival Manual

10th — So I’ve hit a stumbling block on my web project. After doing some significant XCA testing today, I’ve identified that patients with extensive medical histories can push beyond the 1 MB limit on Lambda functions when they are used as ALB targets. It’s not too much of a problem for the history page as I’ve already come up with an optimisation strategy to keep the response body below 1 MB.

The main problem is that some PDFs are in the order of 7 MB. Too large to be used as a direct response from the Lambda function. To get around this, tomorrow I’ll be learning how to upload large documents to S3, generating a pre-signed URL with an expiration date/time to restrict access.

9th — As part of a migration project for HiPRES, I needed to start moving reference data to the new system/database. I’ve written up the process I used to export the PostgreSQL data to a CSV file.

Bridge Head's HiPRES

3rd — Spent an hour drawing this picture titled “First Dance”.

First dance

2nd — Spent a nice couple of hours drawing this colourful owl. Drawn using Derwent Coloursoft pencils: versatile, professional-quality colouring pencils in a range of vibrant colours. Someone from Twitter asked me to draw a version of this for their newly built bar/shed. My first commission, yeah 🎨.

Pencil Sketch of Owl

The image above is displayed using a different method than the other images on this page. It’s using the picture element to present the most optimised image based on what your browser can support. It’s also using the native browser support for lazy image loading.

The image also shows a solid background colour as a fallback placeholder if the images fail to load. On devices with slow internet connectivity, this should present an image placeholder with a meaningful colour until the image loads. This is useful to prevent cumulative layout shift, which is considered bad for front-end performance.

I’m actually in the process of writing a command-line tool to make this easier for developers. Given a source JPG image, it will create .WEBP and .AVIF file formats, picking the five most prominent colours from the photograph to use as the background colour fill. It will then generate the necessary HTML code snippet to use within your web page. I’ll be writing a complete tutorial very soon.


24th — Someone I follow on Twitter likes to share three positives from their day. I’ve found it useful to sit quietly and contemplate at the end of the day, reflecting on what has been positive, reframing the day’s negatives to focus on the positives. Here are mine for today:

Picture of Kindle

I use curl, if not daily, certainly weekly. This is an interesting interview with the maintainer of the project as curl turns 23 years old.

17th — I’ve just finished writing a short story: Agent 13. A story I started writing for my son’s 13th birthday, following in the tradition of the story I wrote for his twelfth birthday. Kira Blackwood is Agent 13, hired to track down Thomas, who has just completed the heist of the decade. A heist that he hoped would make him very rich.

16th — Wow, time is flying by; I’ve not posted in a while. Now that the Covid restrictions have been lifted a bit and up to six people can meet outdoors, I met up with some fellow brass band players and had a practice session. It was really nice being able to make music with other people.

I’ve been making good progress with the Go web application and I’ve been compiling a list of notes about using Lambda functions for web applications. I’ll publish these soon under the title ‘Leveraging Lambda’. In the meantime, I’ve published a short guide to AWS tagging best practices and how to automate the tagging using the AWS CLI.

Here are the latest SCC stats for the web project. I think it’s time to show these to the boss.

Language   Files     Lines   Blanks  Comments    Total
Go            40      6150      811       561     4778
SQL            9       273       52        22      199
HTML           7       701       79         2      620
Makefile       7       169       43        17      109
Markdown       7        39       19         0       20
SVG            7       187        1         0      186
XML            3       146        5         0      141
CSS            1         7        0         5        2
JavaScript     1         7        0         6        1
Estimated Cost to Develop (organic) $179,015
Estimated Schedule Effort (organic) 7.153399 months


29th — I’ve been busy coding a new Go web application over the last three weeks. So far I’ve written almost five thousand lines of code; and according to scc, the estimated effort to reproduce is 5.9 months at a cost of over £81,000.

Language   Files     Lines   Blanks  Comments    Total
Go            27      3760      467       344     2949
SQL            7       203       34        22      147
HTML           6       479       50         1      428
Makefile       6       143       34        14       95
Markdown       6        33       16         0       17
SVG            6       186        1         0      185
CSS            1         7        0         5        2
JavaScript     1         7        0         6        1
XML            1        50        0         0       50
Total         61      4868      602       392     3874
Estimated Cost to Develop (organic) $111,985
Estimated Schedule Effort (organic) 5.985438 months

19th — This week I had to interface with an STS (Security Token Service) server to obtain a SAML assertion for our regional Health Information Exchange. The SOAP message submitted to the STS server needed to contain a wsse:Security header. For this, I had to encode the password digest by combining a nonce with some other key data items. I’ll be writing this up in a future post to show how I managed to do this using Golang.

12th — It’s been a couple of weeks since I last posted. I’ve been busy working on a new web application. The application is going to be made up entirely of Lambda functions triggered from an ALB (Application Load Balancer). So I’ve been learning quite a lot recently. I’m going to publish my learnings soon so I’ve got it fully documented for my future self.

I also had to quickly learn how to make my Go HTTP clients proxy-aware. It turns out it wasn’t too difficult; just a couple of extra lines of code.


21st — I’ve had a week off work and it’s been amazing. Today I learned a great tip for sending the output of a terminal command to the Linux clipboard, so that the contents can be pasted into my code editor. First you need to install the xclip command; then you can use the syntax go run main.go | xclip -selection c to capture the output.

13th — Found Litestream today. It’s a backup utility which streams SQLite changes to S3-compatible storage. This allows you to quickly recover to the point of failure if your server goes down. It makes a bold claim: “Stop building slow, complex, fragile software systems. Safely run your application on a single server.”

There are many benefits to running a start-up business on a single server. Besides the cost savings, simplifying your infrastructure to a single server reduces your time to recovery and makes it easier to move your application to a different provider. You can scale up the VM as your business grows, and many businesses won’t reach the scale that requires multiple servers.

But this is SQLite; nobody writes production applications with SQLite, right?

Well, it can be done. David Crawshaw has a conference talk (worth watching) and blog post on building single-process applications on SQLite.

With Litestream continually backing up your SQLite database, it’s certainly worth investigating. I’d like to try it on a high traffic data API.

12th — I love stumbling across sites like Old Book Illustrations and reviewing old drawings from a time gone by. Some of the drawings under the technical section would make great retro covers for tech books.

Halley's Diving Bell

11th — I’ve been taking it easy this last week. The internet blocking measures I put in place for my son have now been applied to myself, to stop me working outside of my contracted hours. It was hard for the first couple of days, but I’m now finding it liberating. I’m also practising saying “no”. The extra time is allowing me to read for pleasure, which I’m enjoying again.

Today, I created a quick demo video of using Audio Recorder on Linux to capture the system audio on an Ubuntu system.

8th — (❄️) Over the weekend I came to the realisation that I’m struggling with my current situation. I’ve been neglecting my own well-being. I’m sad, low, exhausted and demotivated, all due to burnout. The last twelve months have been unrelenting pressure with high-stress workloads. The last three months have been even worse: I’ve been consistently working fifty to sixty hours a week, sacrificing many weekends to deliver projects. Something has to change; I can’t continue like this. I’ve been suffering headaches and stomach pains, all stress related.

My imposter syndrome has increased over the last several months, and that isn’t helping the situation. Doing some research, I can certainly identify with this trait:

The expert: These individuals are always trying to learn more and are never satisfied with their level of understanding. Even though they are often highly skilled, they underrate their own expertise.

So this weekend I pressed the reset button.

I took afternoon naps to recover the body, and made time for some hobbies that I have neglected. I played the cornet, read a book for pleasure, and did some pencil sketching:

Pencil Sketch of Owl

I doubt it will be a quick recovery, but I’ve made a start. I’ll be sticking to a strict thirty-seven-hour week and working hard to use my weekends as downtime to recover. I’ve also scheduled in some annual leave.

5th — It’s been a busy week at work. Started learning Terraform to provision cloud infrastructure as code. I’ll be revisiting some older manually built projects to redefine them as Terraform scripts.

Worked with the RDS data API to allow supplier access to an Aurora serverless database using the AWS command line utility. This allowed me to keep the RDS server private and grant secure access to the supplier using their IAM access keys.


30th — (🚶‍♂️) Out for a late afternoon walk and managed to capture this amazing sunset.

Sunset over Talkin Tarn

28th — Published first draft of Writing a Windows Service in Go.

27th — I’ve been busy working with suppliers to design and build AWS infrastructure to support reporting from our regional HIE (Health Information Exchange). We’ll be using Logstash to centralize, transform & stash application logs and ATNA data into an AWS Elasticsearch cluster. We used this tutorial on how to install and configure Logstash.

Found a good video which outlines a best-practices checklist for using the Amazon Elasticsearch service.

21st — (🚶‍♂️🎺) Starting to see improvement in my cornet playing; the consistency in practice is starting to pay off. Need to keep it up now, ready for a time when we’re allowed to meet as a band again. The new season of The Infinite Monkey Cage podcast has been released; always funny and informative. Finished watching Bridgerton on Netflix, a fun look at London high society in the 1800s. Very enjoyable watch.

19th — Published Sky router configuration, how to automatically disable the internet on a schedule.

18th — (🚶‍♂️🎺) Started a two-week break from work. Going to spend the time walking (weather permitting) and practising the cornet. I’m also going to use the time to learn some new skills, in particular Terraform.

16th — Published barcode scanning using onscan.js, a look at how to create a SPA (Single Page Application) to barcode receipt / reject laboratory samples at scale.

14th — Published Amazon Rekognition: using the Go Rekognition API to OCR text from an image and to identify subjects within a photograph. Looking at how to set this up so that I can index all the photos within my photo albums. If I use something like bleve, modern text indexing in Go, I’ll be able to search for photos based on their content, like searching for all photos containing castles.

13th — (🎺) Spent a couple of hours investigating the features of AWS Rekognition for image identification, doing some preparation for an upcoming longer blog post. Using the image below, the system identified the contents as: Bird, Animal, Jay, Finch, Blue Jay, with various levels of confidence ranging from 90.49 to 99.89.

Photo of a bird

11th — (🎺) Most of the day was spent trawling through CSV files to ascertain why the files hadn’t been processed by our results ingestion service. In every instance, the files contained invalid or corrupted data 🤷‍♂️. On the plus side, I learned some new grep skills.

10th — (🚶‍♂️🎺) Spent some quality time practising the cornet. Went through several of Jean-Baptiste Arban’s Cornet Method studies, paying particular attention to tone and quality of sound. I really need to focus on getting the most out of each practice session. I particularly like the Andante con spirito on page 106.

Updated the family computer to macOS Big Sur and now the machine is really slow. We’ve had the computer for a while now and it’s probably time to try a fresh install of the operating system. A project for next weekend.

9th — Cold, but sunny. Time for some self-care:

Photo of a tub of ice cream with a chocolate flake

7th — (❄️) Used Excelize to produce exception reports for a project I’m working on. It was quite easy to set up, with a very intuitive API. Jotted down some PostgreSQL Tips. Started listening to The Cipher podcast.

5th — (🎺) Spent the afternoon debugging an issue with a Windows NT service I’d written in Go. Evaluated possible Go packages for generating Excel spreadsheets and found Excelize. The country moves into a third national lockdown due to Covid-19.

4th — First official day back at work. Lethal black ice, the worst I’ve ever seen. Started work building a new cloud micro-service. Configured two CentOS servers and installed Docker onto them. Provisioned a load balancer and SSL certificate. Learned how to monitor and grep the Docker logs for exceptions, so I can now set up a cron job to monitor the logs every fifteen minutes and alert first-line support if exceptions go over a threshold.

3rd — (🚶‍♂️🎺) Jotted down some notes on how to manipulate images from the Linux command line. Downloaded and installed Krita, a professional free and open-source painting program made by artists for artists. Started teaching my ten-year-old how to use it, as he has a flair for drawing.

2nd — Switched the favicon for this site to an embedded base64-encoded image, removing the need to make an additional HTTP request. Added a preconnect header tag to speed up the loading of Google Fonts. Added the contact form, which uses the Static Forms service.

1st — (🚶‍♂️🎺) Wrote some proof-of-concept code to submit an image to AWS’s Rekognition machine learning API to extract batch numbers & expiry dates from the packaging. The AI is very good if the image is the correct way up, less so if the image is rotated either 90 or 180 degrees. It’s promising; I need to test in different lighting conditions and when the packaging is not straight.

Spent an hour or so working on a new short story, Agent 13.

Started listening to The Black Tapes Podcast; a serialized docudrama about one journalist’s search for truth, her subject’s mysterious past, and the literal and figurative ghosts that haunt them both. Do you believe?