Handling Core Image Filter Processing With Concurrency in Swift

Mark Lucking
Published in Better Programming
5 min read · Jan 18, 2022
Photo by Julian Hochgesang on Unsplash

Over the last month, I have published a few articles on concurrent coding. It was a central pillar of WWDC 2021, and I am sure we'll be returning to it at WWDC 2022.

In this piece, I want to look at an area that is a perfect application for concurrent coding: a framework update that Apple slipped under the radar this past summer, Core Image.

The update was to the image filters, which can now be invoked with as few as two lines of code. You can find a great reference to them under this link, which takes you to excellent documentation showing how to configure each filter, with an example of what it looks like. I contend, though, that there is a missing piece you'll need to build to understand what each filter really does. The code in this article is that missing piece.
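To give a sense of that two-line invocation, here is a minimal sketch using one of the built-in filters; the choice of sepia tone, the input file, and the parameter value are illustrative, not the article's own example.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Two lines of configuration: pick a built-in filter and set its inputs.
let filter = CIFilter.sepiaTone()
filter.inputImage = CIImage(contentsOf: URL(fileURLWithPath: "input.png"))
filter.intensity = 0.8

// The filtered result, ready to be rendered by a CIContext.
let output = filter.outputImage
```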

That missing-link code produces a sequence of UnsafeMutableBufferPointers to the data that makes up an image. We can then do a…
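As a minimal sketch of how an image's pixels can be exposed that way, one option is to render the CIImage into raw RGBA8 memory with CIContext.render and bind the bytes to a buffer pointer; the helper name and format choices below are assumptions, not the original listing.

```swift
import CoreImage
import CoreGraphics

// A sketch: render a CIImage into raw RGBA8 memory and hand the caller an
// UnsafeMutableBufferPointer<UInt8> over those bytes. Names are illustrative.
func withPixelBuffer(of image: CIImage, _ body: (UnsafeMutableBufferPointer<UInt8>) -> Void) {
    let context = CIContext()
    let width = Int(image.extent.width)
    let height = Int(image.extent.height)
    let bytesPerRow = width * 4                         // RGBA8: 4 bytes per pixel
    let byteCount = bytesPerRow * height

    let rawData = UnsafeMutableRawPointer.allocate(byteCount: byteCount,
                                                   alignment: MemoryLayout<UInt8>.alignment)
    defer { rawData.deallocate() }

    // Render the (possibly filtered) image into our own bitmap memory.
    context.render(image,
                   toBitmap: rawData,
                   rowBytes: bytesPerRow,
                   bounds: image.extent,
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())

    // Bind the raw bytes to a typed pointer and wrap them in a buffer pointer.
    let pixels = rawData.bindMemory(to: UInt8.self, capacity: byteCount)
    body(UnsafeMutableBufferPointer(start: pixels, count: byteCount))
}
```

From there, each row (or chunk) of the buffer could be handed to a separate task for concurrent processing, which is where the concurrency angle of this piece comes in.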



Written by Mark Lucking

Coding for 35+ years, enjoying using and learning Swift/iOS development. Writer @ Better Programming, @The StartUp, @Mac O’Clock, Level Up Coding & More
