Challenge: Focus on Focus APIs in SwiftUI

Image showing three text fields with one in focus

With device input — as with all things in life — where you put your focus matters. Focus can help people move through your app, whether they’re using the keyboard, Siri Remote, Apple Watch Digital Crown, or accessibility features — and you can make that experience even better with SwiftUI’s Focus APIs.

Begin the challenge

Our challenge to you: Find a part of your app where you can use the SwiftUI Focus APIs to fine-tune that interaction. That could mean testing a new tvOS implementation, polishing keyboard-driven navigation, or crafting a first-rate accessibility experience.
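As a starting point, here’s a minimal sketch of the iOS 15 focus APIs — @FocusState and the focused(_:equals:) modifier — using a hypothetical login form (the view and field names are illustrative, not part of the challenge):

```swift
import SwiftUI

struct LoginView: View {
    // The fields in this form that can hold focus
    enum Field: Hashable {
        case username, password
    }

    @State private var username = ""
    @State private var password = ""
    @FocusState private var focusedField: Field?

    var body: some View {
        Form {
            TextField("Username", text: $username)
                .focused($focusedField, equals: .username)
            SecureField("Password", text: $password)
                .focused($focusedField, equals: .password)
            Button("Sign In") {
                // Move focus to the first empty field; clearing focus
                // (focusedField = nil) dismisses the keyboard
                if username.isEmpty {
                    focusedField = .username
                } else if password.isEmpty {
                    focusedField = .password
                } else {
                    focusedField = nil
                }
            }
        }
    }
}
```

Setting the bound @FocusState value programmatically moves focus, and setting it to nil is also how you dismiss the keyboard without any taps.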

Need support, or want help from the community as you explore the Focus APIs? You can share your progress in the Developer Forums.

Visit the Apple Developer Forums

Resources

Direct and reflect focus in SwiftUI

With device input — as with all things in life — where you put focus matters. Discover how you can move focus in your app with SwiftUI, programmatically dismiss the keyboard, and build large navigation targets from small views. Together, these APIs can help you simplify your app’s interface…

SwiftUI Accessibility: Beyond the basics

Go beyond the basics to deliver an exceptional accessibility experience. Learn how to use the new SwiftUI Previews in Xcode to explore the latest accessibility APIs and create fantastic, accessible apps for everyone. Find out how you can customize the automatic accessibility built into SwiftUI to…

Read the WWDC21 Challenges Terms and Conditions

Challenge: Memgraph Capture the Flag

Flag symbol on grey background

In the “Detect and diagnose memory issues” session at WWDC21, we explored how debugging memory problems can help improve your app’s performance, while “Symbolication: Beyond the basics” showcased debug symbols and how symbolication helps us connect the dots during code debugging. Now, it’s time to put those new skills to work.

If you like solving puzzles, you’re in the right place. One of our engineers has hidden a memory Easter egg in our secret app. We’re trying to track it down, but all we know is that it has the format flag_<unknown_string_here>@WWDC. You’ll have to use the command-line tools offered by macOS to investigate the memory issue, recover missing symbols, and capture the rogue flag.

Begin the challenge

To get started, download the challenge .zip attached to this article and unzip the folder. We also have a message from our engineer to get you on the right track: “Memgraph is a special binary plist. What can you find in its properties?”
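If you’d like a warm-up before opening the real memgraph, here’s a sketch (using a throwaway dictionary, not the challenge file) of how Foundation round-trips data through the binary plist format the hint describes — the same kind of structure that `plutil -p` prints in human-readable form:

```swift
import Foundation

// A throwaway dictionary standing in for a binary plist's properties
let properties: [String: String] = ["hint": "look at my plist properties"]

// Serialize to the binary plist format, then read the properties back
let blob = try! PropertyListSerialization.data(
    fromPropertyList: properties, format: .binary, options: 0)
let decoded = try! PropertyListSerialization.propertyList(
    from: blob, options: [], format: nil) as? [String: String]
print(decoded?["hint"] ?? "")
```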

Download the Memgraph Capture the Flag Challenge material

And once you’ve explored the challenge, check out one solution to find the flag.

Challenge: Solution to “Memgraph Capture The Flag”

The “Memgraph Capture the Flag” challenge invites you to learn and practice memory debugging and symbolication with command line tools. If you haven’t yet attempted the challenge or otherwise don’t want to be spoiled on the necessary steps to complete it, we recommend returning to the…

You can solve these kinds of puzzles and track down memory issues in your own app, too. Try creating reference cycles in your app, saving a memgraph, and tracing them back to your source code. And for more debugging details, check out the WWDC21 sessions below.

Resources

Symbolication: Beyond the basics

Discover how you can achieve maximum performance and insightful debugging with your app. Symbolication is at the center of tools such as Instruments and LLDB to help bridge the layers between your application’s runtime and your source code. Learn how this process works and the steps you can take…

Detect and diagnose memory issues

Discover how you can understand and diagnose memory performance problems with Xcode. We’ll take you through the latest updates to Xcode’s tools, explore Metrics, check out the memgraph collection feature in XCTest, and learn how to catch regressions using a Performance XCTest.

Read the WWDC21 Challenges Terms and Conditions

Challenge: Create amazing documentation

Swift packages icon of brown box with swift icon on it

Explore Xcode’s new documentation features and learn how to add documentation to your own framework or package — or to your favorite open source project. For this challenge, we’re asking you to create documentation for one of them: Use Xcode 13 to build documentation from the comments in your Swift framework’s source, and add a Documentation Catalog to organize your content.

Begin the challenge

Open up your project in Xcode, and start adding documentation comments in your source by using Swift DocC markdown syntax. DocC uses the comments you write in your source code as the content for the documentation pages it generates. At a minimum, add basic documentation comments to the framework’s public symbols for DocC to use as their single-sentence abstracts or summaries. Here’s an example:

 /// A model of a sloth.
 public struct Sloth {
     /// The species of the sloth.
     public var species: Species
 }

Once you’ve finished your documentation, select Product > Build Documentation to generate your source docs for Quick Help and the Developer Documentation window.

Xcode documentation window displaying information about sloths

Need help writing or constructing your documentation? You can share your progress on the Developer Forums.

Visit the Apple Developer Forums

Resources

Read the WWDC21 Challenges Terms and Conditions

Challenge: Voice Control Synonyms

Icon of speech bubble with question mark in it on purple background

Challenge yourself to make your app accessible through Voice Control and provide support for voice-based interaction. Voice Control is a feature built into iOS, iPadOS, and macOS that empowers people who can’t use traditional input devices to control their iPhone, iPad, and Mac entirely with their voice. For people with motor limitations, having full voice control of their devices is truly transformative: People can gesture with their voices to click, swipe, and tap anywhere — everything someone could do with a mouse or with touch. On iOS and iPadOS, Voice Control has the additional option to show Item Names, which places a name next to each tappable item. In this challenge, we’ll be making the “Show Names” experience better.

Voice Control in Podcasts on iOS: “Show names.”

Suppose that you create a button that looks like a paper airplane. What do you say to tap? “Tap send”? “Tap reply”? “Tap airplane”? In UIKit you can use the accessibilityUserInputLabels string array to respond to these prompts, while in SwiftUI you’d use the .accessibilityInputLabels modifier.
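In UIKit, that might be sketched like this (the button and its label strings are hypothetical):

```swift
import UIKit

let sendButton = UIButton(type: .system)
sendButton.setImage(UIImage(systemName: "paperplane"), for: .normal)
// Speakable alternatives for Voice Control; the first is shown on screen
sendButton.accessibilityUserInputLabels = ["send", "reply", "airplane"]
```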

How to enable Voice Control

To use Voice Control, go to Settings > Accessibility > Voice Control. If it’s your first time enabling this setting, you’ll be asked to Set Up Voice Control and download a short file.

Once Voice Control has been set up, you can enable it in a few different ways:

  • You can ask Siri to turn Voice Control on or off for you at any time.
  • You can use the Accessibility Shortcut in Settings > Accessibility and set the shortcut to Voice Control. Then, when you triple-click the side button (or Home button, depending on your device), you can quickly turn Voice Control on or off.

Use Voice Control to interact with iPhone

Begin the challenge

We’re challenging you to improve the Voice Control experience in your app by making its UI as easy as possible to navigate by voice. Start by turning on Voice Control in Settings > Accessibility > Voice Control, and enable Overlay > Show Names.

Next, take a screenshot of your interface with the “Show names” overlay displaying on top of it. Explore what it’s like to navigate your app by Voice Control alone. What experience are you giving people right now? Are you struggling with any common tasks? How could you make it better?

Once you’ve spent some time with your app in Voice Control mode, it’s time to make some improvements. Here are a few tweaks you can make to your code to make your experience better for everyone.

Explore accessibilityInputLabels
First, you can implement accessibilityInputLabels to create short, concise labels that someone could easily speak by voice.

Button(action: {
    sendMessage = true
}) {
    Image(systemName: "paperplane")
        .font(.title)
        .accessibilityInputLabels(["send", "reply", "airplane"])
}

Tips:

  • Your primary string is the first string in the array, and will be the one that Voice Control shows on screen.
  • Brevity is key: use short, succinct words.
  • Localize your strings using NSLocalizedString and avoid special symbols in your labels.
  • Add synonyms judiciously: Limit the array to a maximum of four strings so you don’t overload the recognition system.

You may have multiple elements in your UI that could be described the same way: One example is an image browser, where each image might be described as “Screenshot”. You can rely on Voice Control’s disambiguation feature in these cases to keep your label names short. When someone says “Screenshot”, a list of numbers will appear over all elements named “Screenshot” for someone to choose from.

Two screenshots with voice control labels, showing Voice Control’s disambiguation feature

Shorten label names
If your app already incorporates accessibilityLabel, you’ve done a lot of the work already — but your labels may be too long to speak! You can take advantage of accessibilityUserInputLabels (or, in SwiftUI, .accessibilityInputLabels) to keep the speakable label short, while leaving the valuable information your current accessibilityLabel conveys to an audience that relies on it.

two views with voice control labels shown, comparing “birthday plans edited 4 days ago” versus “birthday plans”
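A minimal sketch of that pairing in SwiftUI, using a hypothetical note row:

```swift
import SwiftUI

struct NoteRow: View {
    var body: some View {
        Text("Birthday Plans")
            // Rich description read by VoiceOver
            .accessibilityLabel("Birthday plans, edited 4 days ago")
            // Short, speakable name shown by Voice Control
            .accessibilityInputLabels(["Birthday plans"])
    }
}
```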

Share your experiences
As you add support for Voice Control to your app, share your implementation with the developer community. After you’ve made changes or improvements to your app, take another screenshot of your UI with the “Show names” overlay enabled. Share “before” and “after” screenshots on the Developer Forums. (And don’t forget to add alt text to your screenshot images on platforms that support it!)

Resources

Visit the Apple Developer Forums

Voice Control

accessibilityUserInputLabels

accessibilityInputLabels(_:)

Read the WWDC21 Challenges Terms and Conditions

Challenge: Create fun visual effects in Swift Playgrounds

Hammer symbol and paint brush symbol

Ever wonder how to make it seem like confetti is raining down from the sky? Or how to create a kaleidoscope effect using code? In this challenge, your goal is to create a compelling visual effect using the Shapes book from the Swift Playgrounds app. Maybe it’s a constellation of objects revolving in intriguing mathematical patterns, or a textural and fluid shape that adapts to your touch. It’s all up to you: What kind of visual effect would you like to dream up?

Begin the challenge

To get started, download and open Swift Playgrounds on your iPad or Mac, then select See All from the lower right corner to launch the Swift Playgrounds content screen. From here, you can find the Shapes book under “Starting Points” and download a copy to your device.

Swift Playgrounds app showing downloadable books and challenges

The Shapes starting point has some great examples to reference as you get started. Check out the page “Shape Graphics” to explore the book’s basic API for creating all shape types and placing them in the scene. “Touches and Animations” will show you how to apply animations to shapes and use touch events to drive behaviors. And finally “Sprite Shapes” can help you learn how to set up physics interactions between different shapes. From there, you’ll have all you need to create your own visual composition.

Want to show off your visual concept to the community? You can share your creation (or creation-in-progress) on the Developer Forums.

Visit the Apple Developer Forums

Resources

Download Swift Playgrounds for macOS

Learn more about Swift Playgrounds

Read the WWDC21 Challenges Terms and Conditions

Optimize your app for 5G

Wi-Fi and LTE have long helped apps deliver connected experiences like video streaming, social networking, and online gaming — and 5G networking can provide even more opportunities to take advantage of high-bandwidth and low-latency connections in your app. Discover how you can optimize your existing app or build a new product from the ground up with 5G in mind so that you can move more data faster and deliver a great experience to people around the world.

Up your 5G game

One of the best ways you can take advantage of 5G’s high-performance characteristics is to offer multiple versions of an asset depending on someone’s network connection. If a person using your app is on a lower-bandwidth connection, your app can download or deliver files appropriate for that network; likewise, you can deliver higher-bandwidth assets when the connection supports it so that you can provide more information quickly. You can apply this to multiple different types of apps, including:

Streaming apps Video streaming apps can incorporate intelligent buffering and playback with AVFoundation to serve 4K (or 4K HDR) content to devices on 5G networks as well as lower-bandwidth options when connected to LTE.

Learn more about AVFoundation

Games Use the bandwidth available on 5G networks to deliver higher-quality visuals and gameplay with larger texture maps and higher-poly models than you might otherwise use on lower-bandwidth networks. Additionally, 5G’s lower-latency connection provides for faster overall play and state-saving actions between players and your server backend.

Learn more about SceneKit

Machine Learning apps If your app uses Core ML, you can improve both the speed and reliability of your on-device intelligence when connected to a 5G network by automatically retrieving larger .mlmodel and .mlarchive files from your server to run locally on someone’s device.

Core ML

AR apps While on 5G, you can provide a greater number of high-resolution objects within your ARKit scenes to provide a richer augmented experience for people interacting with your app. You can also use the extra bandwidth available over 5G networks to share even larger ARWorldMap and ARPointCloud objects in a shared AR experience — for instance, working collaboratively to lay out physical spaces with virtual objects from your app.

ARKit

Tune your transfers

Apple networking APIs automatically provide optimized management and performance for each platform and network type. In addition, you can further optimize your app to address potential cellular issues like movement speed and direction, cellular infrastructure demand, and interference.

Forget the network Because 5G networks typically offer better performance than Wi-Fi, it’s up to you to decide how your app best utilizes network resources — and you no longer need to rely on overall network type (cellular or Wi-Fi) to do so. Instead, you can use Constrained and Expensive to describe various network states. Each of these states relies on information from a person’s Data Mode choices (as defined in Settings > Cellular > Cellular Data Options) as well as their cellular plan restrictions.

isExpensive

isConstrained

For example, the network automatically switches to Constrained when someone enables Low Data Mode. When a person’s network is Constrained, your app should minimize network data usage regardless of the value of Expensive. If the network is Expensive but not Constrained, your app should be considerate about fetching network resources, though it needn’t impose strict limits. If the network is neither Constrained nor Expensive, your app can focus on providing the highest-quality experience with minimal consideration for data usage.

Each networking framework uses the Constrained and Expensive indicators in specific ways. When using URLRequest, for example, your app can indicate which resource should be retrieved by setting the appropriate value on the allowsConstrainedNetworkAccess and allowsExpensiveNetworkAccess properties. In contrast, when using NWConnection, your app can access the state of the network through stateUpdateHandler via the isConstrained and isExpensive properties of your connection’s currentPath. And, if your app uses AVFoundation instead of the Network framework or URLRequest, there are similar keys, including AVURLAssetAllowsConstrainedNetworkAccessKey and AVURLAssetAllowsExpensiveNetworkAccessKey.
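For instance, a sketch of the URLRequest approach for a hypothetical high-bandwidth asset:

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Hypothetical 4K asset URL; substitute your own resource
var highRes = URLRequest(url: URL(string: "https://example.com/video-4k.mp4")!)
// Fail fast in Low Data Mode rather than pulling down a 4K file
highRes.allowsConstrainedNetworkAccess = false
// Cellular may be marked expensive, but this download is still allowed there
highRes.allowsExpensiveNetworkAccess = true
```

If such a request fails because the network is constrained, the resulting URLError reports networkUnavailableReason == .constrained — your cue to retry with a smaller asset.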

Regardless of technique, remember that the values for Constrained and Expensive are transient and can change as someone moves from one type of network connection to another. If your app dynamically monitors these changes, you’ll always provide the best experience for people, no matter their connection.
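One way to monitor those changes is with NWPathMonitor from the Network framework; here’s a sketch, where the asset tiers are an assumption about one reasonable policy rather than an API:

```swift
import Network

// Which quality of assets to fetch, given the current path
enum AssetTier { case minimal, standard, premium }

func tier(isConstrained: Bool, isExpensive: Bool) -> AssetTier {
    if isConstrained { return .minimal }  // Low Data Mode: smallest assets
    if isExpensive { return .standard }   // cellular/hotspot: be considerate
    return .premium                       // unrestricted: highest quality
}

let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    // Called again whenever the connection changes, e.g. Wi-Fi to 5G
    let chosen = tier(isConstrained: path.isConstrained,
                      isExpensive: path.isExpensive)
    print("Fetching \(chosen) assets")
}
monitor.start(queue: .main)
```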

allowsExpensiveNetworkAccess

prohibitExpensivePaths

Provide a fallback Unless your app is designed specifically for a network with guaranteed performance characteristics, like a corporate or private connection, you should always make sure it functions well — even when there’s no network available at all. When someone initially downloads your app, make sure it delivers acceptable-quality assets as part of the bundle. If the app has periods of fast connectivity, you can then download higher-quality files and store them locally to ensure they’re available when someone leaves network range or goes offline entirely.

Support your surroundings Most cellular providers have prioritized 5G rollouts in high-density areas: entertainment venues such as sports stadiums and amphitheaters, transportation hubs like train stations and airports, centers for business and education, and points of interest like public parks and tourist landmarks. When people recognize they’re in a high-performance networking location, they may want to explicitly enable caching and other features in your app before heading to a destination with reduced coverage — consider incorporating interface elements that surface this option and let people immediately download any relevant content.

Take advantage of built-in frameworks Apple’s hardware and on-device frameworks are tuned to deliver advanced functionality in a power-efficient manner. For example, you can use Core ML for on-device intelligence instead of client-server round trips, or ARKit and the Vision framework for capturing, processing, and presenting insightful information in the field. On-device processing minimizes the need to exchange large amounts of data — let alone potentially personally identifiable information — and eliminates the need to connect to a back-end process in order to provide a useful service in your app.

When you do need to move large amounts of data, you can lean on Smart Data Mode for 5G-enabled devices. This feature monitors your app’s state along with any Apple frameworks you’re currently using to automatically switch between existing cellular frequencies in a manner that ensures your app receives the highest possible bandwidth — all without sacrificing battery life.

For example, when an app is in the foreground and playing video using the AVFoundation framework, Smart Data Mode ensures that high-bandwidth 5G is enabled. In addition, Smart Data Mode monitors the streaming experience while someone is connected to a 5G network. If the stream is throttled due to traffic shaping — either by the cellular provider or the limitations of your CDN — the feature will identify the throttled throughput and move the stream to an LTE connection to conserve power. Background requests for data using the core networking frameworks can be served just as well over LTE or lower power frequencies.

Get out there

Previously, testing your app’s networking code involved toggling the network state, switching between Wi-Fi and cellular data, and then using a network conditioner and other tools to alter various characteristics. While this is still a great way to test for basic use cases, nothing beats getting out and exploring the edge-case scenarios only a deployed network can throw your way.

Start small The App Store has a variety of apps you can use to determine the network characteristics for a given area. With one of these apps and a regional carrier’s wireless coverage map, you can track down the perfect spots in your area to ensure your app is selecting the correct resources at the right time. And once you’ve found that perfect 5G networking spot, move to another where your coverage is sub-optimal and check your app. Did it keep running? Did your streaming content degrade as expected or move to local resources? Did it deliver acceptable-quality assets after a fresh install? The more real-world use cases you can test in advance, the better the experience overall will be for people around the world.

Go big While many third-party websites provide performance data for cellular networks, that data is aggregated and only approximates performance at the location where someone might be using your app. Because it’s only valuable as a baseline, it’s not a substitute for knowing how your app performs in the wild. You can use TestFlight for iOS to scale your beta testing to people and networks around the world. You may also want to consider creating a TestFlight group — not only to ensure your app is bug-free, but to confirm it performs well based on the overall network characteristics of the cohort you’ve assembled.

Move to the edge You can’t always control where your server infrastructure resides, but where you can influence that decision, take steps to minimize the distance between your server and your app. Reducing the distance between people and your backend can vastly improve your app’s network performance. One way to improve your own app’s connection is to select a hosting provider that can federate your server backend to map closely to the cellular networks your app uses. Alternatively, you may want to consider using a few strategically located CDNs.

Reach out

5G networks provide a real opportunity to enhance your existing app with richer data or build entirely new experiences that were previously not possible. If youʼre working on creating an amazing experience with 5G and would like to share it with us, let us know.

Contact us

Learn more about supporting 5G in your apps

WWDC21 Daily Digest: Day 4

A Memoji staring into an open MacBook Pro

Welcome to day 4 of WWDC, or — as we’re calling it — Apple Design Awards day! We’ve got a fresh round of session videos, labs, challenges and some hardware to hand out later on, as well as lots of other fun activities in our pavilions and digital lounges. Read on.

And the Apple Design Award winners are…

… being announced this afternoon! Stream the live presentation of the Apple Design Awards starting at 2 p.m. PDT. (Virtual rounds of applause will be accepted.)

WWDC21 Apple Design Awards

WWDC21 Apple Design Awards (ASL)

Get all caught up

Missed any of the fun this week? No worries: Our official recap videos will get you caught up in no time.

Wednesday@WWDC21

Tuesday@WWDC21

Monday@WWDC21

Day 4 in the WWDC pavilions

Another set of great sessions, labs, and activities have arrived in the pavilions: Try out a Framework Freestyle in the Essentials pavilion and learn a new framework in 100 lines of code or less. Discover how to design memorable SharePlay experiences in the Audio and Video pavilion. Sign up for one of Friday’s design labs in the Design pavilion. And get a glimpse of a magnificent future without passwords in the Privacy & Security pavilion.

Design for spatial interaction

Design for Group Activities

Discover rolling clips with ReplayKit

Create image processing apps powered by Apple Silicon

Optimize high-end games for Apple GPUs

Challenge: Framework Freestyle

Build Mail app extensions

Swift concurrency: Behind the scenes

Learn to meditate (even if you’re fidgety)

At 11 a.m. PDT, hear from special guest speaker Dan Harris, an Emmy-winning journalist, Good Morning America anchor and author of the best-selling book Meditation for Fidgety Skeptics. After having a nationally televised panic attack in 2004, Harris found himself on a long and often bizarre journey that ended with the discovery of mindfulness meditation. Today, Harris will discuss his journey, as well as the books, podcast, and app that have helped millions manage the stress and anxieties of today’s world—including previous non-believers like himself. (Want a sneak peek? Check out Harris’s app Ten Percent Happier.)

Meditation for fidgety skeptics

Lock down a lab appointment

There’s still one more day to register for a lab appointment with Apple engineers, designers, and specialists for 1-to-1 guidance and conversation.

Until Day 5…

That’s it for today! But rest up — we’ve got one more big day for you tomorrow.

Challenge: Framework Freestyle

Framework icon with a question mark on top of it on a yellow background

No matter your level of expertise, it can be daunting to step out of your comfort zone when you’re first learning about new frameworks or technologies. Our challenge today presents a fun and interactive way to encourage you to try something new with an ARKit sample app and one framework of your choosing. What can you create in 100 lines of code or less?

Begin the challenge

This challenge is a gamified augmented reality experience created with RealityKit and ReplayKit. To participate, you’ll need the iOS 15 developer beta and Xcode 13. Once you have those, download the Framework Freestyle sample project from this challenge, open it in Xcode, then build and run the app on your iOS device.

When you engage with the app, it triggers a mystery sequence of Apple frameworks, randomly selecting one of them. Here comes the fun part: We’re asking you to build something new using whatever framework the randomizer lands on — and do so using 100 lines of code or less! For example, if it lands on SwiftUI, you could experiment in Xcode with the canvas, or try making a basic search bar with .searchable. Don’t worry too much about building something perfect: Use this challenge to break the ice, learn, and have fun.
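If the randomizer lands on SwiftUI, for example, a basic .searchable list might be sketched like this (the view and its sample data are hypothetical):

```swift
import SwiftUI

struct FrameworkListView: View {
    let frameworks = ["ARKit", "RealityKit", "ReplayKit", "SwiftUI"]
    @State private var query = ""

    // Filter the list against the current search text
    var filtered: [String] {
        query.isEmpty
            ? frameworks
            : frameworks.filter { $0.localizedCaseInsensitiveContains(query) }
    }

    var body: some View {
        NavigationView {
            List(filtered, id: \.self) { Text($0) }
                .searchable(text: $query)
                .navigationTitle("Frameworks")
        }
    }
}
```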

Resources

WWDC21 Challenge: Framework Freestyle

Read the WWDC21 Challenges Terms and Conditions

Challenge: Sense the world around you in Swift Playgrounds

Hammer symbol and paint brush symbol

Just like our senses, our devices constantly gather data from their environment, and can use that information to do interesting and important things. In this challenge, you’ll harness your iPad’s sensor data to create a visualization or experience of your choosing.

Begin the challenge

To get started, download and open Swift Playgrounds on your iPad, then select See All from the lower right corner to launch the Swift Playgrounds content screen. From here, you can find the Sensor Create book under “Starting Points” and download a copy to your device.

Swift playgrounds content on iPad

In this challenge, you’ll use the Sensor Create playground book to gather some data from the world around you. You can use audio data (frequency, volume) from the microphone, light data (color, brightness) from the camera, and also gyroscope data (movement in X, Y, Z coordinates) from the motion sensor of the device. The book has some great reference material to help you get started: Check out “Using Device Motion,” “Using Light to Play Sound,” “Clappy Fish,” and “Synesthesia.”

Think about the types of things you could decipher about your environment based upon this sensor data. Given this, how can you write some code that visualizes this information in interesting ways? For example, you could create an alarm that goes off when the volume around you is too high, or create a notification that displays when you quickly accelerate or decelerate, asking if you or your device has fallen. These are just examples: Use your imagination and come up with an idea you love!
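The Sensor Create book has its own API for reading the microphone, so treat this as a sketch of just the threshold logic behind such a volume alarm (the function and its default threshold are assumptions):

```swift
// Decide whether to sound an alarm, averaging a short window of
// volume samples so a single spike doesn't trip it
func shouldSoundAlarm(volumeSamples: [Double], threshold: Double = 0.8) -> Bool {
    guard !volumeSamples.isEmpty else { return false }
    let average = volumeSamples.reduce(0, +) / Double(volumeSamples.count)
    return average >= threshold
}
```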

Resources

Download Swift Playgrounds for iOS

Learn more about Swift Playgrounds

Read the WWDC21 Challenges Terms and Conditions

Challenge: Design a quiz in Swift Playgrounds

Hammer symbol and paint brush symbol

Do the people in your life know your favorite animal noise? How about the book that changed your life? Your least favorite pizza topping? Design a “How well do you know me?” quiz to see who knows the most about your quirks and interests.

Begin the challenge

This challenge invites you to create a quiz using the Answers book in Swift Playgrounds. Your “How well do you know me?” quiz should ask a series of questions and ultimately return a score (and possibly a colorful description!) based upon how many questions were answered correctly.

To get started, download and open Swift Playgrounds on your iPad or Mac, then select See All from the lower right corner to launch the Swift Playgrounds content screen. From here, you can find the Answers book under “Starting Points” and download a copy to your device.

Use the Swift Playgrounds app to download the Answers book for this challenge.

The Answers starting point contains a page called “API Overview,” which dives into the API for this playground. You can use the show call to display text or images, and use several different ask calls to request feedback from the player and store their response as variables. You can use the combination of these API calls to build up your own custom quiz questions and check responses against your own answer key.
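Separate from the Answers book’s own show and ask calls, the scoring half of a quiz might be sketched like this (the types and answer key are hypothetical):

```swift
import Foundation

struct QuizQuestion {
    let prompt: String
    let answer: String
}

// Count how many responses match the answer key, ignoring case
func score(responses: [String], against questions: [QuizQuestion]) -> Int {
    zip(responses, questions).filter { response, question in
        response.caseInsensitiveCompare(question.answer) == .orderedSame
    }.count
}
```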

You can use Answers to build all sorts of different text-based trees and quizzes, and if you feel strongly about making a different type of quiz, please do! We highly encourage you to explore different ways of using this starting point to make something that you’re excited about. And if you’d like to share what you’ve built with the community, post a video of your quiz in action or share a link in the Developer Forums.

Visit the Apple Developer Forums

Resources

Download Swift Playgrounds for macOS

Learn more about Swift Playgrounds

Read the WWDC21 Challenges Terms and Conditions