Somebody nicked our credit card number and Bank of America sent new cards. Cool. Then I have to manually update Apple Wallet on like 6 different devices… not as cool.

(HBO) Max app: If you liked “Julia”, you may also like “Rap Sh!t”!

Me: Sure sure sure.

TIL: AVCaptureDevice documentation says the ‘uniqueID’ is ‘a unique identifier that persists on one system across device connections and disconnections, application restarts, and reboots of the system itself’. Unless you plug that webcam into a different USB port. Then the Mac is all ‘Woah! WTF is this thing?!’

Because the good hay is on the bottom. Apparently. #RabbitsOfMastodon

A German Lop rabbit completely covered in hay as she digs for the best piece at the bottom of the pile.

I have a project that failed to build under Xcode 15. It links a 3rd-party framework written in C++, and the link step was failing with unresolved symbols. The classic-mode flag for the linker fixed it right up.
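If you’re hitting the same unresolved-symbol errors, the flag in question is most likely Xcode 15’s `-ld_classic`, which falls back to the old ld64 linker (an assumption on my part; the linked toot has the details). A sketch of the build setting, in xcconfig syntax:

```
// In the failing target's build settings (Other Linker Flags).
// -Wl, passes the flag through clang to the linker.
OTHER_LDFLAGS = $(inherited) -Wl,-ld_classic
```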

mastodon.social/@jamestho…

Just working on some recreational Ansible. As you do.

Wait, Star Trek: Strange New Worlds s2e10 is the end of the season?

Oh no.

Python in Excel! But you have to write your python in the little Excel formula box? Like getting a Ferrari but having to drive it with an app on your phone.

techcommunity.microsoft.com/t5/excel-…

It’d be cool if the Apple Card (like the physical card) worked with Find My, so my wallet doesn’t have to have a dog tag.

That thing where the list of states on a web form is alphabetized by state name, but the display text is the two-letter abbreviation… so the list goes NV, NE, NH, NJ, NM, NY, NC, ND. Who looks at that and is like, yeah, we got it!
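The fix is to sort by the same text you display. A minimal sketch (hypothetical state data):

```python
# A two-letter state list alphabetized by full state name looks shuffled
# to the user, who only ever sees the abbreviations.
states = {"NV": "Nevada", "NE": "Nebraska", "NH": "New Hampshire",
          "NJ": "New Jersey", "NM": "New Mexico", "NY": "New York",
          "NC": "North Carolina", "ND": "North Dakota"}

by_name = sorted(states, key=lambda ab: states[ab])  # what the form did
by_abbrev = sorted(states)                           # what users expect

print(by_name)    # ['NE', 'NV', 'NH', 'NJ', 'NM', 'NY', 'NC', 'ND']
print(by_abbrev)  # ['NC', 'ND', 'NE', 'NH', 'NJ', 'NM', 'NV', 'NY']
```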

Am I reading this right?

techcrunch.com/2023/08/1…

Disney+ and Hulu are getting big price increases, $3 a month for both services on the ad-free tier. So if you subscribe to both, it currently costs $26 and that’s getting bumped to $32. BUT they’re adding a bundle of ad-free Hulu and Disney+ for $20. So if you subscribe to both (like me!), this is actually going to save $6?

Office 365

I’ve been moving SonicBunny Software from Google Workspace to Office 365 over the last few weeks. I like 365 and it offers a lot of features, but the time it takes for a change to apply is a bit of a shock. Configuring Exchange and many other things takes 5 or 10 minutes to kick in. I guess Google is similar, but with a small installation, Google seemed nearly instant where O365 really does take several minutes. But I just hit the kicker – ‘Please allow 24 to 48 hours for this to take effect.’ Yowza!

First hard drive, 10s of megabytes. Latest hard drive, 10s of terabytes. My brain can’t really fathom how much more that is.

A side dish of “cucumber salad” is fine, but order a “bowl of pickles” and people look at you funny.

HVAC died today on (almost) the hottest day of the year, 8 days after its annual service. Luckily, the AC peeps were out in about an hour! Turned out to be a loose wire in an electrical box — easy fix. 🥵

www.allamerican-nc.com

iPhone rerouted us over the weekend to avoid “severe weather”. First time I’ve seen that. It helped but we still nearly floated away.

Can you guess the TV show from a random person in the opening credits?

Only scored 9 out of 10. It’s the black and white western that got me.


We’ve been using UniFi Talk for a while for our home phone. It’s kinda… basic. But they got SMS support a while ago! Yay! But the SMS to email relay has a delay of usually 10 minutes. Boo! Not great when the repair guy texts that he’s 10 minutes out and you get the text 10 minutes later. But, turns out you can also be notified via Slack webhook, which happens immediately. Now we get instant delivery of SMS messages to our home number and only have to go through like three different systems. Yay?
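For the curious, a relay like that can be pretty small. This is a hypothetical sketch of my own, not UniFi’s actual integration — just the standard Slack incoming-webhook payload built with the Python standard library (the URL and field names here are placeholders):

```python
# Hypothetical sketch: forward an incoming SMS to a Slack incoming webhook.
import json
import urllib.request

def build_slack_payload(sender: str, body: str) -> bytes:
    """Format an incoming SMS as a Slack incoming-webhook JSON payload."""
    return json.dumps({"text": f"SMS from {sender}: {body}"}).encode("utf-8")

def post_to_slack(webhook_url: str, payload: bytes) -> None:
    """POST the payload to the webhook; Slack replies 'ok' on success."""
    req = urllib.request.Request(
        webhook_url, data=payload,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

payload = build_slack_payload("+1-555-0100", "Be there in 10 minutes")
print(payload.decode("utf-8"))
```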

Furniture was rearranged without authorization. Dragon is not amused! #RabbitsOfMastodon

Callisto, Jupyter and Mac Optimized Machine Learning – Part 2

In my last post, I looked at how to install TensorFlow optimized for Apple Silicon. This time around, I’ll explore Apple Silicon support in PyTorch, another wildly popular library for machine learning.

Setting up Callisto for PyTorch is easy! The suggested pip command is

pip install torch torchvision torchaudio

And we can do that directly in the Callisto package manager. Remember, you can install multiple packages at a time by adding a space separated list, so paste torch torchvision torchaudio into the install field and away we go!
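Once the install finishes, a quick sanity check (my own snippet, not from the PyTorch docs) confirms that PyTorch can see the Metal GPU:

```python
# Verify the install and check whether the MPS (Metal) backend is usable.
import torch

print(torch.__version__)
# True on an Apple Silicon Mac with a recent PyTorch; False elsewhere.
print(torch.backends.mps.is_available())

# Pick the fastest available device, falling back to the CPU.
device = "mps" if torch.backends.mps.is_available() else "cpu"
print(f"Using device: {device}")
```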

I was looking for a little example to run that would compare the performance of PyTorch on the Apple Silicon CPU with performance on the GPU. To be quite honest, it was difficult to find a straightforward example. Fortunately, I ran across this notebook by Daniel Bourke. Daniel works through an example training a model on both the CPU device and the MPS device. MPS is the Metal Performance Shaders backend, which uses Apple’s Metal framework to harness the power of the M1’s graphics hardware. In this example, he creates a Convolutional Neural Network (CNN) for image classification and compares the performance of the CPU and MPS backends.
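If you want a quicker taste than a full CNN training run, here’s a tiny benchmark sketch of my own — just repeated matrix multiplies on each device. The exact speedup will vary wildly with the workload, so don’t read too much into the number:

```python
# Time a batch of matrix multiplies on the CPU and, if available, on MPS.
import time
import torch

def time_matmul(device: str, size: int = 512, iters: int = 20) -> float:
    """Return wall-clock seconds for `iters` matmuls of size x size."""
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    start = time.perf_counter()
    for _ in range(iters):
        c = a @ b
    if device == "mps":
        torch.mps.synchronize()  # MPS runs async; wait for the GPU to finish
    return time.perf_counter() - start

cpu_time = time_matmul("cpu")
print(f"cpu: {cpu_time:.3f}s")
if torch.backends.mps.is_available():
    mps_time = time_matmul("mps")
    print(f"mps: {mps_time:.3f}s ({cpu_time / mps_time:.1f}x speedup)")
```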

The bottom line? MPS is at least 10x faster than using the CPU. In Daniel’s posted notebook, he saw a speedup of around 10.6x. On my machine, I saw a performance increase of about 11.1x. The best part is that the optimization doesn’t require any separate build or plugin: the MPS backend ships in the standard PyTorch install on the Mac, so selecting the mps device is all it takes to get the performance boost.

In addition to TensorFlow and PyTorch, I checked how some other popular Python ML libraries take advantage of Apple Silicon. While some libraries have chosen not to pursue Apple Silicon-specific optimization, all of them run correctly in CPU mode.

Callisto, Jupyter and Mac Optimized Machine Learning

We build Callisto with the mindset that Callisto is the best way to do data science on a Mac. Part of that is helping users get the most out of their Mac hardware by using computational libraries optimized for Apple Silicon chips. TensorFlow is a very popular library for machine learning, so let’s take a look and see what it takes to use an M1-optimized version of TensorFlow with a Jupyter notebook in Callisto.

TensorFlow has a feature called PluggableDevice which lets developers create plugins for different pieces of ML hardware. Conveniently for us, Apple has written a plugin for Metal which is heavily optimized for Apple Silicon devices like the M1 and M2 chips. Now we just have to get it installed.

You should be able to just install the TensorFlow library for the Mac and then the PluggableDevice for Metal, which you’d do with these commands:

pip install tensorflow-macos
pip install tensorflow-metal

With Callisto, you can use our fancy package manager interface and install tensorflow-macos and tensorflow-metal. Unfortunately, other package dependencies mean that pip won’t install the latest tensorflow-macos (2.12.0) and instead falls back one version to 2.11.0. Meanwhile, pip will happily install the latest tensorflow-metal, but the PluggableDevice interface is a C API that’s tightly bound to the tensorflow-macos version. So while both modules install, at runtime there’s a symbol mismatch error and the Metal plugin fails to load.

Cue montage of trying to install several permutations of these two packages.

To jump to the end: as suggested in this post on the Apple Dev Forum, the more recent versions have problems, but falling back to tensorflow-macos version 2.9.0 and tensorflow-metal version 0.5.0 works with no issues. Pip will install those versions with the following commands:

pip install tensorflow-macos==2.9.0
pip install tensorflow-metal==0.5.0

Don’t forget, you can specify versions using Callisto’s package manager right in the package field by adding the version specifier. Instead of just tensorflow-macos, use tensorflow-macos==2.9.0.

Now that we’re up and running, let’s do some tests! We want to compare running on the CPU alone versus running with the hardware-accelerated Metal GPU. Here’s a little bit of code to disable the GPU device in TensorFlow:

import tensorflow as tf
tf.__version__

# Flip this to False (and restart the kernel) to re-enable the GPU.
disable_gpu = True

if disable_gpu:
    # Hide the GPU from TensorFlow so all work runs on the CPU.
    tf.config.set_visible_devices([], 'GPU')

tf.config.get_visible_devices()

When disable_gpu is True, you should see only one CPU device in the output. With the GPU enabled, you should see both the CPU and GPU. TensorFlow doesn’t handle changes to device visibility well after the library is up and running, so to switch the state of the GPU, remember to restart your Jupyter kernel.

Now we’re ready to test! First I tried this Quickstart for Beginners from the TF website. Running this example on the CPU, it completed in 7 seconds. With the GPU enabled, it took 42 seconds. What, what?! It’s slower using the fancy Metal-optimized GPU driver? Yep, turns out that’s right. As noted on Apple’s tensorflow-metal page, the CPU can be faster for small jobs. Well, that’s a little disappointing.
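In case you want to reproduce the timings, here’s one simple way to measure a run — a small perf_counter context manager (my own helper, not part of the TF example) that you can wrap around model.fit() or any training loop:

```python
# A tiny wall-clock timer for benchmarking training runs.
import time
from contextlib import contextmanager

@contextmanager
def timed(label: str):
    """Print how long the wrapped block took, tagged with `label`."""
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")

# Usage with a real model would look like:
#   with timed("CPU run"):
#       model.fit(x_train, y_train, epochs=5)
with timed("demo"):
    time.sleep(0.1)
```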

Now if we look at Apple’s example on that same page, it’s got a little more heavy lifting to do. Running that on my M1 CPU takes just under half an hour: 29 minutes and 12 seconds. On the GPU, it blazes through the job in 5 minutes and 10 seconds! Cutting my run time to about 1/6 of the original is definitely a solid improvement. That kind of performance spike makes all the installation headaches worth it!

With tensorflow-metal on the cusp of a 1.0.0 release, we’re excited to see how we can integrate this into our builds and include this out of the box with Callisto, but until then, these instructions should help shepherd you through a manual install.

Christian (and others) got a raw deal a lot like the folks behind Tweetbot and Twitterific. As a small dev it’s become a huge gamble to create an app based on an API that you do not own. They can (and clearly will) rip the rug out from under you.

mastodon.social/@christia…

So glad to see the emphasis on testing in the Sync to iCloud with CKSyncEngine session! More of this please.

#WWDC

developer.apple.com/wwdc23/10…

#WWDC 2023 – Keynote Post Mortem

Wow. That was a lot.

I was pleasantly surprised to see the Apple Silicon Mac Pro. Some folks online expressed a bit of sticker shock, but it’s a high-end machine and comes with a high-end price. Part of that is due to the Mac Studio. Before the release of the Studio, your desktop Mac choices were between a Mac mini and a Mac Pro. In that lineup, the mini had to stretch to the mid-range and the old Intel Mac Pro started off in the upper mid-range. Now the Studio is positioned to take up that “medium” position, where you’re doing serious work, but not making Avatar A20: The Final Avataring. With the Studio providing coverage for those mid-range workflows, the Pro is really only for top-end jobs, especially those that require PCIe cards like a Fibre Channel interface.

Siri didn’t get the big upgrade I was hoping for. The whole ChatGPT thing has really exploded in the last few months and that’s a little too soon for Apple to act on it in time for this year’s OS release. Siri is getting the ability to handle back-to-back requests. We’ll have to see how that plays out with the betas this summer.

Looking forward to today’s “What’s New in Xcode 15” to get a better feel for the Xcode improvements. The State of the Union seemed to focus on a lot of support for visionOS, but that may just be the shock and awe talking after the big reveal yesterday. Browsing through all the sessions, I’ve bookmarked 35 I’d like to see, which is a ton. Hopefully I’ll get through ten this week.

Scanning through the session topics, I see a lot of Swift / SwiftUI talks and, as you might expect, a lot of visionOS sessions. Sadly, a lot of topics I’m interested in aren’t getting a lot of attention. tvOS has one session about the Continuity Camera support. There are only a handful of sessions about iOS, iPhone, or iPad, and most of those are about running your app in the visionOS environment. HomeKit isn’t mentioned at all. CarPlay and HomePod get one session each. The reality is that even for a week-long, online developer conference, there’s only so much bandwidth. There’s so much for Apple to tell us about visionOS and this is their one chance, so it’s all magic goggles, all the time.

Now if they just handed out some free samples.

Turns out if you register a Mac UUID on the Apple Dev Portal, but forget to set the platform to macOS, the portal decides that it is… an iPod. So I’ve got that going for me, which is nice.