Automate screenshot generation for iOS apps with Fastlane and framer


Once your app is on the AppStore, you need to take great screenshots of it so that users are willing to try it out. If your app design changes a lot, or you’re planning to localize it in many languages, it will take you a lot of time just to grab those screenshots. That’s why we, at Spreaker, decided to automate this heavy task. Here’s how we did it!

Our flagship Radio app is fully localized in 3 languages (English, Spanish and Italian). It supports all iOS devices, both iPhones (from the smallest 4s with its 3.5” display up to the 6s Plus with 5.5”) and iPads (7.9”, 9.7” and 12.9”).

Because we want to showcase 4 screens of our app, we need to take 4 screenshots per device, per language. With 6 devices and 3 languages, that’s 6 × 3 × 4 = 72 screenshots in total!

And this task needs to be done from scratch every time we change the UI to add new features or improve its looks. How crazy is that?

Let’s introduce Snapshot (from Fastlane)

Thanks to snapshot, a tool that’s part of fastlane, taking the screenshots is quite easy. It iterates over a list of devices (using the iOS Simulator) and languages, and saves the screenshot files locally.
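snapshot reads its device and language lists from a Snapfile. A sketch of what ours could look like (the device list, languages and scheme name here are illustrative, not our exact configuration):

```ruby
# Snapfile — tells snapshot which simulators and languages to iterate over.
devices([
  "iPhone 4s",
  "iPhone 6s Plus",
  "iPad Pro"
])

languages([
  "en-US",
  "es-ES",
  "it-IT"
])

# UI test scheme that drives the app (name is illustrative).
scheme "RadioUITests"
```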

The complex part comes when you want to control which screens of the app open (in order to take screenshots of them) and which data to use to showcase the best of each screen.

Open a specific screen

To make it easy to find an element on screen and tap it, we set an accessibilityIdentifier on those elements, so that our UI tests can locate them:

let app = XCUIApplication()

// Select explore section
let toolbarsQuery = app.toolbars
toolbarsQuery.buttons["explore_section"].tap()

// Open subsection
app.buttons["see_all"].tap()

// Capture the screen; "Explore" becomes part of the screenshot file name
snapshot("Explore")

explore_section and see_all are accessibility identifiers, so they won’t change when the device language changes, and they are not visible to the user!

Populate app with data

Our Radio app is based on a REST API, so to control what it displays we need to feed it with API responses. To make this sustainable, we let the app request data from a local webserver (running on localhost), so we can keep the fixture data and the UI tests in the same place.

To switch the API base address in the app, we pass an environment variable to the app process, so that in the app we can check for it and switch from the production address to the local one.

We do this in the setUp function in the test.

let app = XCUIApplication()
app.launchEnvironment = ["Screenshots": "1"]
app.launch()

In the app, where we configure the API base URL to call, we check for the Screenshots key:

static func isUITest() -> Bool {
    #if DEBUG
        return NSProcessInfo.processInfo().environment["Screenshots"] != nil
    #else
        return false
    #endif
}

Be smart and safe. Ensure this override will affect ONLY the DEBUG builds and not the ones for the AppStore!
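A minimal sketch of such a DEBUG-guarded switch, written in current Swift syntax (the APIConfig type and both URLs below are illustrative names, not Spreaker's actual values):

```swift
import Foundation

// Illustrative base-URL selection: APIConfig and both URLs are made-up names.
struct APIConfig {
    static var baseURL: String {
        #if DEBUG
        // UI tests launch the app with the "Screenshots" environment key set,
        // so DEBUG builds can point at the local fixture server instead.
        if ProcessInfo.processInfo.environment["Screenshots"] != nil {
            return "http://localhost:8080"
        }
        #endif
        // AppStore builds always talk to production.
        return "https://api.example.com"
    }
}
```

Because the local branch is compiled only under DEBUG, a Release build physically cannot hit the fixture server, even if the environment variable were set.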

Do the last mile with framer

Now that we have all the real screenshots, we can go one step further and put them inside a beautiful frame, so they will look much nicer on the AppStore.

Fastlane comes with frameit, a tool that does exactly that, but our designer wanted more control over the screenshots, so we built a custom fastlane action to do it. We call it framer.

Framer needs 2 things to work: the screenshots taken with snapshot and some templates.

A template is an image that will be used as the background for the final screenshot, plus some simple configuration data. framer basically needs to know where to overlay the screenshot on the template image and at what size to scale it down so it fits in the template.

The configuration is a simple json file named Config.json:

{
  "default": {
    "text": {
      "color": "#545454",
      "font": "SF-UI-Display-Thin.otf",
      "padding": 20
    },
    "image": {
      "offset": "+0+0",
      "width": 0
    }
  },
  "iPhone4s": {
    "image": {
      "offset": "+157+171",
      "width": 330
    },
    "text": {
      "offset_y": 804,
      "height": 160,
      "size": 44
    }
  }
}
iPhone4s is the name of a template (you can guess which device size it targets). It has to match the name of the template image file.
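The overlay math implied by the offset and width keys can be sketched like this (the parsing and the Overlay type are our illustration, not framer's actual implementation; it assumes positive, ImageMagick-style "+x+y" offsets):

```swift
import Foundation

// Where to draw a screenshot on a template, and at what size.
struct Overlay {
    let x: Int
    let y: Int
    let width: Int

    // Parse an ImageMagick-style offset such as "+157+171".
    init?(offset: String, width: Int) {
        let parts = offset.split(separator: "+").compactMap { Int($0) }
        guard parts.count == 2 else { return nil }
        self.x = parts[0]
        self.y = parts[1]
        self.width = width
    }

    // Scale the screenshot to the target width, keeping its aspect ratio.
    func scaledHeight(screenshotWidth: Int, screenshotHeight: Int) -> Int {
        return screenshotHeight * width / screenshotWidth
    }
}
```

For the iPhone4s template above, a 640×1136 screenshot would be scaled to 330 points wide and drawn with its top-left corner at (157, 171).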

framer also supports localized text that will be drawn over the template, in the position defined in the Config.json file.

The text has to be written in a text.json file saved in the same folder that contains the real screenshots to process. Since that folder is localized, we have the same file translated for each language.

	"Explore": "Browse original content created\nby thousand of podcaster.",
	"Player": "Like and share episodes\nand leave comment.",
	"Show": "Seek through episodes of\nyour favorite podcast.",
	"Chat": "Chat with hosts as they go live,\nand interact with other listeners."

Explore, Player, Show and Chat are keywords used in the screenshot file names; they come from the strings passed to snapshot() in the UI tests.
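The matching can be sketched as a simple substring lookup (the file-name scheme below is an assumption for illustration, not framer's exact logic):

```swift
import Foundation

// Pick the caption key whose name appears in the screenshot file name.
func textKey(forScreenshot fileName: String, keys: [String]) -> String? {
    return keys.first { fileName.contains($0) }
}
```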


Let’s run our UI tests to take the screenshots.

In 12 minutes, our iOS Radio app is built and run on 6 different devices in 3 different languages each, and for each combination 4 screenshots are taken.

72 screenshots are then framed by framer in less than 2 minutes.

In the time it takes me to grab a coffee and chat with a colleague, we have a new set of screenshots ready to be uploaded to the AppStore, with no interaction from me and no review or editing by our designer.

Super cool, huh?

Last but not least, to upload all these shiny new framed screenshots we use deliver (part of Fastlane too).

These are the lanes involved in our Fastfile:

desc "Take screenshots of the app"
lane :take_screenshots do

  # Capture screens
  snapshot(
    skip_open_summary: true
  )

  # Frame them
  framer
end

desc "Upload all screenshots for AppStore build"
lane :upload_screenshots do

  # Upload screenshots only, skipping metadata and the binary
  deliver(
    app_version: "" + `cd .. && agvtool what-marketing-version -terse1 | tr -d '\n'`,
    skip_metadata: true,
    skip_binary_upload: true
  )
end

Where can you get it?

framer’s source code has been open-sourced on GitHub and published as a gem, so you can use it by running the command

fastlane add_plugin framer

If you have issues or any feedback for improving it, open an issue on GitHub or drop me a line on Twitter.