Building a macOS Touch Bar app using interprocess communication in Electron

In this post, I will go through the process of building a feature in a macOS desktop application built with Electron. The feature implements the native macOS Touch Bar APIs.

The feature we need to build

I have a desktop application called TextDiffer which allows you to see the difference between two snippets of text. The app will hopefully be published soon and is currently awaiting approval from the Mac App Store. 🤞 Update: it's available now.

The TextDiffer app has a button in the UI that says "Perform Diff" which will allow you to run the diff checker. As a user, I would like to access this action using a button in my macOS Touch Bar that says "Perform Diff."

In TextDiffer, I am also able to configure my UI theme between light and dark mode. Depending on the theme I've chosen, the "Perform Diff" button will be a different colour: orange for light mode, and purple for dark mode. As a user, I would like the colour of the "Perform Diff" button in my macOS Touch Bar to be the right colour depending on my theme.

Light and dark variants of the Perform Diff button

Prerequisites

Before getting started, you will need to understand JavaScript and how to develop basic web apps using any (or no) framework or library.

You should also be familiar with Node.js.

Having a basic understanding of how Electron works helps too.

APIs

For this exercise, we will only be using standard Electron APIs: TouchBar, ipcMain, and ipcRenderer.

IPC Basics

Let's go over some basics for interprocess communication in Electron.

Processes

There are two processes that we need to concern ourselves with. If you're familiar with Electron app development, or with developing native mobile apps, you are likely already familiar with these concepts:

  • Main – the main process is the one that runs in Node.js. This is the layer that has access to the native desktop APIs via corresponding Electron APIs.
  • Render – the web process. This is the one that runs in the client-side web app of your Electron application. This can be an app written in Vue, React, Angular, or simply vanilla JavaScript.

About message passing

In order for the main process to communicate with the render process, and vice-versa, these processes will need to send messages to each other.

Messages in Electron have two parts:

  • the channel
  • the message data

The channel is a string value and represents the name or topic of messages. Messages are sent on a channel. In this application, I have set up two separate channels:

  • main-actions – the channel I use to send messages to the main process
  • render-actions – the channel I use to send messages to the render process
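Since both processes must agree on these channel names, one small pattern (my own convention, not an Electron requirement) is to keep them in a shared module that both main.js and render.js import:

```javascript
// channels.js — shared between main.js and render.js so the channel
// names can never drift apart. The file name is my own choice.
const MAIN_ACTIONS = 'main-actions';     // messages for the main process
const RENDER_ACTIONS = 'render-actions'; // messages for the render process

module.exports = { MAIN_ACTIONS, RENDER_ACTIONS };
```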

Message passing follows an event-based (observer) approach: you send messages with data on a channel, subscribe to a channel to listen for incoming messages, and perform actions in response to the messages you receive.

You can set up your channels as you like—one single channel, one channel per action, one channel per process where actions are shared, etc.

Though I am using two separate channels—one for each process—you may be able to use a single channel to communicate both ways. I have not tested this, but if you have, please let me know in the comments how it turned out!

Listening for messages

Electron allows you to listen for events on the main process via the ipcMain module.

// main.js
const { ipcMain } = require('electron');

ipcMain.on('main-actions', (event, data = {}) => {
  // Do things on the main process
})

Similarly, Electron allows you to listen for events on the render process via the ipcRenderer module.

// render.js
const { ipcRenderer } = window.require('electron');

ipcRenderer.on('render-actions', (event, data = {}) => {
  // Do things on the render process
})

Important: You may have noticed that I am using require differently in each of these processes. While the main process uses the standard Node.js require, the render process calls require on the window object, which is how Electron expects you to load its modules from the render process. It's a good idea to check that window.require exists before calling it so that no exceptions are thrown. This is especially useful if you build your JavaScript app as a standalone web app first and integrate it with Electron afterwards; in the standalone web app, window.require would not be present.
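One way to implement that guard is shown below. This is a sketch: getIpcRenderer and the no-op stub are my own helpers, not Electron APIs.

```javascript
// render.js — resolve ipcRenderer safely whether or not we're inside Electron.
function getIpcRenderer(globalScope) {
  if (globalScope && typeof globalScope.require === 'function') {
    // Running inside Electron: window.require is available.
    return globalScope.require('electron').ipcRenderer;
  }
  // Running as a plain web app: return a stub so calls are harmless no-ops.
  return { send: () => {}, on: () => {} };
}

const ipcRenderer = getIpcRenderer(typeof window !== 'undefined' ? window : null);
```

With this in place, the rest of the render code can call ipcRenderer.send and ipcRenderer.on unconditionally, in both the standalone web app and the Electron build.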

Structuring the message data

Messages you send can have any shape. The message data is the second argument in the callback. In my example code above, I've called that argument data and I've defaulted it to the empty object {}. Your data could just be a string if you wanted, or a number, or any value. I use an object so that I can pass multiple values if needed. I'm doing this because I will be sending multiple types of messages on each of these channels.

I like to structure my message data similar to how Redux actions are structured:

  • each action has a type property which is a string that I can use to switch/case on
  • additional properties are custom, based on the action

While a simpler action may just have a type property and no additional data, a more complex action would have a type property and whatever data is required to perform that action.

const performDiffAction = {
  type: 'performDiff'
}

const initTouchBarAction = {
  type: 'init_touch_bar',
  theme: 'dark'
}

There is no requirement as to what format your type property should follow. In Redux, the community as a whole is not unanimous between camelCase, snake_case, and SCREAMING_CASE, so feel free to use whichever you prefer.

So, given that my actions are shaped as above, my listeners could look something like this in the render process:

// render.js

ipcRenderer.on('render-actions', (event, data = {}) => {
  switch (data.type) {
    case 'performDiff':
      return window.globalPerformDiff();

    default:
      // no-op
  }
})

And something like this on the main process:

// main.js

ipcMain.on('main-actions', (event, data = {}) => {
  switch (data.type) {
    case 'init_touch_bar':
      return initTouchBarWithConfig({ theme: data.theme });

    default:
      // no-op
  }
})

Sending messages

While listening for messages on the main and render processes is quite similar, sending messages is slightly different.

I should also mention that there are limitations with the data that can be sent in a message.

Your message data must be serializable. Primitives work well for this, e.g. strings, numbers, and plain objects that consist only of primitives. Put simply, if you are passing an object, you should be able to run JSON.stringify and then JSON.parse on your data and get the same value back reliably.

If your data is not serializable, you will encounter errors. One error you may see is Error: An object could not be cloned. This can occur if you try to pass a reference to a function as part of your message data, since the reference cannot be serialized. Put simply, you cannot send the main process a message containing a click handler and attach that handler to the Touch Bar button; a payload like { type: 'doSomething', onClick: doSomething } would likely fail.

We need to use message passing both ways in order to implement Touch Bar functionality end-to-end.

Render -> Main

This part will go over sending a message from the render process (the client-side web app) to the main process (the Node.js app with access to the native layer via Electron).

Let's say when my app initializes, I would like to initialize the Touch Bar with the dark theme. I am choosing to do this from the render process. Doing it from the render process adds an extra step of message passing, but it also gives me more flexibility.

I could have simply done it in the main process without going through the render process, but since I want to configure the style based on the user's UI theme, I need the user to tell me this (from the browser).

So, given that I'd like to send a message from the render process to the main process, the message could look something like this:

// render.js

function initTouchBarWithTheme() {
  const theme = localStorage.getItem('theme') || 'light';

  ipcRenderer.send('main-actions', {
    type: 'init_touch_bar',
    theme: theme
  })
}

The above code will result in the init_touch_bar message being sent from the browser to the Node.js process. This would be a good spot to implement the Touch Bar API, creating a button, and configuring the colour based on the theme sent from the render process.

Main -> Render

This part will go over sending a message from the main process (the Node.js app) to the render process (the web app running in the browser).

Given that the main process is where I need to be to interact with native APIs, I would set up my Touch Bar and all of its buttons on the main process. But, considering my application is an Electron app and the bulk of my logic happens in the UI, I would likely need any user interaction with that button to be propagated to the render process.

So far we've sent a message from the render process and included the theme, which tells us on the main process that we'll need to set the button to be a specific colour. Let's implement that instruction now.

Here is some basic code for creating a Touch Bar with a single button where the colour will change depending on the provided theme:

// main.js
const { BrowserWindow, TouchBar } = require('electron');
const { TouchBarButton } = TouchBar;

function initTouchBar({ theme }) {
  const window = BrowserWindow.getFocusedWindow();

  const touchBar = new TouchBar({
    items: [
      new TouchBarButton({
        label: 'Perform Diff',
        backgroundColor: theme === 'light' ? '#FF9A49' : '#302AE6',
        click: () => {
          console.log('TODO!');
        },
      })
    ]
  });

  window.setTouchBar(touchBar);
}

The code above is enough code to get a Touch Bar with a single button that changes colour depending on the theme.

That's great and all, but the button won't do anything until we implement the click handler. In this case, clicking the button should trigger a form submit in our render process. We won't go into detail about what that form submission does or how it's implemented; we just need to trigger it, which we can do by clicking the submit button. This will propagate the event and submit the form.

There is more than one way that we can tell the render process to execute JavaScript from the main process. The two ways I'll touch on both use the webContents API:

  • Sending a message – no surprise, given that's what this article is about. This is the approach I prefer.
  • executeJavaScript API – an alternative approach that I do not prefer.

So, before we get into how I would do it, let's briefly talk about the alternative (i.e. how I would not do it). The alternative is to skip message passing and instead use the executeJavaScript API, which takes a string of JavaScript and returns a promise that resolves with the result. With this approach, I could look up my DOM element and trigger a gesture (like a button click) by putting that code right in the argument string.
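To make the contrast concrete, here is a rough sketch of that alternative. The 'perform-diff-button' id is assumed from the render markup, and the helper name is mine:

```javascript
// main.js — the alternative I'd avoid: the whole handler lives in a string.
const clickScript = `document.getElementById('perform-diff-button').click();`;

function performDiffViaExecuteJavaScript(win) {
  // win is an Electron BrowserWindow; executeJavaScript evaluates the
  // string in the page and resolves with the result.
  return win.webContents.executeJavaScript(clickScript);
}
```

Notice that the page-side logic exists only as a string, so it can't be unit-tested or reused the way a normal function can.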

While this API would be adequate for very simple tasks, it may not be ideal in the following cases:

  • my application is using a framework like Angular, Vue, or React, as this would be outside of the framework code
  • my code is relatively complex or comprehensive

I would not personally use this approach. The full range of support via this Electron wrapper is not predictable, may be unreliable, and will likely be quite difficult to debug if you have any issues. Users have experienced some limitations with this approach, at least in its earlier days.

I would prefer to leverage message passing here as well: send a message from the main process to the render process, and execute my code once that message is received. I can send a very simple message with primitive data that I am confident will arrive as intended, and just make sure I'm listening for that message in my web app. Since the code I execute is not contained in a string, I can write tests for it or run it in other scenarios without duplicating it into the string argument of a function call.

Message passing from the main process to the render process is also done via the webContents API. I'm going to create the button click handler, which will send a message to the render process. This message will not have any additional data other than the type property.

// main.js

const onPerformDiff = () => {
  const window = BrowserWindow.getFocusedWindow();
  window.webContents.send('render-actions', { 
    type: 'performDiff' 
  });
};

Then, I'll attach it to the button I just made:

// main.js

function initTouchBar({ theme }) {
  const window = BrowserWindow.getFocusedWindow();

  const touchBar = new TouchBar({
    items: [
      new TouchBarButton({
        label: 'Perform Diff',
        backgroundColor: theme === 'light' ? '#FF9A49' : '#302AE6',
        click: onPerformDiff,
      })
    ]
  });

  window.setTouchBar(touchBar);
}

This means that once my orange (or purple) button is tapped on the Touch Bar, it will send the message { type: 'performDiff' } to the render process.

I can listen for that message on the render process and execute some JavaScript. A reminder of the listener we've already set up:

// render.js

ipcRenderer.on('render-actions', (event, data = {}) => {
  switch (data.type) {
    case 'performDiff':
      return window.globalPerformDiff();

    default:
      // no-op
  }
})

In this case, my app is a vanilla JavaScript app and I've saved a method globalPerformDiff to the window so I can easily access it:

// render.js

window.globalPerformDiff = () => {
  const button = document.getElementById('perform-diff-button');
  button.click();
};

I just need to trigger a click on the "Perform Diff" button, which will propagate a form submission event, and my app will take care of the rest.

This is it! With everything we've gone over, you should be able to implement a Touch Bar button end-to-end.

I have put my Touch Bar initialization code in a separate method, initTouchBar, which in reality does more than this article covers. I am also calling it multiple times. Here are some examples of when I'm calling it:

  • when my app loads
  • when the user toggles between the light and dark UI themes
  • when the user performs a diff, where I conditionally show and hide buttons depending on some settings
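The theme-toggle case, for instance, could be wired up roughly like this. This is a sketch with the ipc and storage objects injected so the logic is testable; makeThemeToggler is a hypothetical helper of mine, not TextDiffer's actual code:

```javascript
// Flips the stored theme and asks the main process to rebuild the
// Touch Bar in the new colour.
function makeThemeToggler(ipc, store) {
  return function toggleTheme() {
    const next = store.get('theme') === 'dark' ? 'light' : 'dark';
    store.set('theme', next);
    ipc.send('main-actions', { type: 'init_touch_bar', theme: next });
    return next;
  };
}

// In render.js this would be hooked up roughly as:
//   const toggle = makeThemeToggler(ipcRenderer, {
//     get: (k) => localStorage.getItem(k),
//     set: (k, v) => localStorage.setItem(k, v),
//   });
//   document.getElementById('theme-toggle').onclick = toggle;
```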

The IPC lifecycle in Electron, visualized

Now that you've seen the code, let's go over the lifecycle of the interprocess communication events for this feature.

  1. from render: Send a message from the render process to the main process to create the Touch Bar for the specified UI theme
  2. from main: Configure the button with the theme colour using the Electron TouchBar API
  3. from main: Configure the button click handler of the above button to send a message to the render process using the Electron webContents API.
  4. on main: User interacts with the Touch Bar button
  5. from main: Send a message from the main process to the render process for the clicked button
  6. from render: Perform an action based on the message and its data

Conclusion

And that's how I would implement a macOS Touch Bar feature in an Electron app.

Interprocess communication in Electron is useful for implementing all kinds of native desktop functionality end-to-end, like custom desktop notifications, interacting with the camera, and more. Check out the Electron documentation for a list of available APIs.

I hope you've learned something here today. Let me know about your experiences with IPC in Electron, and your experience with Electron in general.

And feel free to check out TextDiffer once it's launched! 🚀🚢