
Challenge: Explore spatial audio soundscapes

Person SF Symbol surrounded by speaker SF Symbols on a blue-green textured background

When you support multichannel audio in your app, game, or audiovisual content, you can add dimensionality to your storytelling, transport listeners out of a conventional audio experience, and bring people further into the worlds you’re creating. In this challenge, we invite you to explore a soundscape in intentional stereo, stereo spatialization, and multichannel formats, consider the differences between each, and brainstorm how spatial audio could enhance your own stories.

Begin the challenge

In the demo for “Immerse your app in spatial audio,” the presenter, Simon, has his spatial audio demonstration video ‘fail’ on him. After a few moments of panic and quick thinking, he decides he’s going to describe the events of this failed video — and as he does, those events begin to come to life. In each scene, different audio techniques create a fully-imagined environment for the audience to inhabit, drawing them into Simon’s described world.

Immerse your app in spatial audio

Discover how spatial audio can help you provide a theater-like experience for media in your apps and on the web. We’ll show you how you can easily bring immersive audio to those listening with compatible hardware, and how to automatically deliver different listening experiences depending on…

For this challenge, we’re inviting you to explore this demo and listen to the differences as you move between intentional stereo, spatialized stereo, and multichannel formats. You’ll need a pair of spatial audio-compatible accessories, like AirPods Pro, as well as a device running iOS 14 or macOS Big Sur. To explore the spatialized stereo format, you’ll need a device running iOS 15 or macOS Monterey. As you listen, consider how each version of the experience feels different.

Listen to the demo in intentional stereo
Watch the session on iOS 14 or macOS Big Sur in the Developer app or on developer.apple.com. You’ll hear this demo delivered in stereo, with visual cues on screen indicating the full surround experience.

What do you notice, listening to this experience? How does the audio shift and change as we move through each of Simon’s soundscapes?

Listen to the demo for “Immerse your app in spatial audio” in the Developer app

Listen to the demo in spatialized stereo
To hear the spatialized stereo version of “Immerse your app in spatial audio,” watch this demo on a device running the iOS 15 or macOS Monterey developer betas.

How does this change the experience for the listener compared to the intentional stereo version? Are there sounds or moments you find yourself noticing that you haven’t before?

Listen to the demo for “Immerse your app in spatial audio” in the Developer app

Listen to the demo in multichannel
You can stream a full multichannel mix of this demo in spatial audio on either iOS 14 or iOS 15 using the link below.

As you listen to this multichannel mix, divide the world into three distances: Background, foreground, and personal space. Think about where the sounds are coming from. What do these different distances and orientations add to the experience? How does the positioning of sound further enhance this story?

Listen to the demo for “Immerse your app in spatial audio” in multichannel audio

Now, think about a moment from your app, game, or audiovisual content. How does the audio currently sound in this moment? How can you improve that experience by bringing distance and orientation into your soundscape? Where could supporting multichannel and spatial audio help draw people into your story?

Spatialize your experiences
With more support for spatial audio experiences coming with iOS 15 and macOS Monterey later this year, now is a great time to consider how you currently approach telling stories through audio. Play your existing stereo experiences on the developer betas and listen to how they change when played back in spatialized stereo. Consider how you can augment those existing moments by creating multichannel mixes. And explore what kinds of entirely new experiences you can provide listeners with spatial audio. For game designers, we also recommend watching “Discover geometry-aware audio with the Physical Audio Spatialization Engine (PHASE)” to learn about new ways you can create spatial audio experiences that adapt to the environment of your game.


Note: if you’re curious about experimenting further with multichannel, you can try creating a multichannel audio experience of your own by exploring the binaural and surround mixing tools in digital audio workstations (DAWs) like Logic Pro X. (To properly mix and play back your audio in surround, however, you’ll need a multichannel speaker setup.)


Want to share your thoughts on this demo, or how you’d adapt your existing experiences for spatial audio? Head over to the Developer Forums.

Visit the Apple Developer Forums

Resources

Immerse your app in spatial audio

Discover how spatial audio can help you provide a theater-like experience for media in your apps and on the web. We’ll show you how you can easily bring immersive audio to those listening with compatible hardware, and how to automatically deliver different listening experiences depending on…

Discover geometry-aware audio with the Physical Audio Spatialization Engine (PHASE)

Explore how geometry-aware audio can help you build complex, interactive, and immersive audio scenes for your apps and games. Meet PHASE, Apple’s spatial audio API, and learn how the Physical Audio Spatialization Engine (PHASE) keeps the sound aligned with your experience at all times — helping…

Read the WWDC21 Challenges Terms and Conditions


Challenge: Achievement Unlocked — Series Finale

Achievement symbol with an achievement icon that looks like a star in a circle.

When you create achievements that truly surprise and delight your players as they make their way through a game, you can help elicit a feeling of accomplishment, or even make the player laugh. Most of us have experienced playing games where we have unlocked some kind of achievement or trophy that stirs some of these emotions. This is exactly what we want you to consider when you create Game Center achievements for your apps or games — and now, we’re challenging you to show the developer community your best, funniest, strangest, and most delightful achievements.

Begin the challenge

This achievements challenge focuses on sequencing achievements to encourage people to complete a specific set of tasks. Progress-based achievements are the most common types of rewards players can earn, and coupling some of them together can help create an even stronger narrative within your app or game.

Whether your game is divided into chapters, levels, or some other way, today’s challenge is to create a series of at least two or three achievements that break up the narrative in a unique and interesting way. Think about how you might add titles and descriptions in both the locked and unlocked versions of each achievement that indicate to the player that these are connected and need to be unlocked in a specific order. (And as always, bonus points for puns and amazing alliteration.)
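If you’ve already adopted Game Center, reporting each step of such a sequence is straightforward. Here’s a minimal sketch — the achievement identifiers below are hypothetical placeholders for IDs you’d configure in App Store Connect:

```swift
import GameKit

// Hypothetical identifiers for a three-part achievement sequence.
enum SeriesAchievement: String {
    case partOne = "series.part1"
    case partTwo = "series.part2"
    case finale  = "series.finale"
}

/// Marks one step of the sequence as fully unlocked.
func unlock(_ step: SeriesAchievement) {
    let achievement = GKAchievement(identifier: step.rawValue)
    achievement.percentComplete = 100
    achievement.showsCompletionBanner = true

    GKAchievement.report([achievement]) { error in
        if let error = error {
            print("Failed to report achievement: \(error.localizedDescription)")
        }
    }
}
```

Your game logic decides when each step is earned; Game Center simply records and displays the unlocks in order.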

We welcome all achievements, new, existing, or imaginary: Show off your “Series Finale” achievements from one of your existing apps or games, or put your wordsmithery to work and create an entirely new set of achievements. You can share these with the developer community on the Developer Forums.

Best of all, we’ve made it easy for you to participate and dream up awesome achievements even if you haven’t yet implemented Game Center or you want to try writing something entirely new: Just download the attached Game Center achievement template.

Visit the Apple Developer Forums

Best practices for great achievements

It’s a lot of fun to create unique and engaging achievements to connect people with your app or game. Below are a few of our recommendations when thinking about writing and designing achievements.

Be creative with an achievement’s title, but straightforward with its description
Although most people appreciate entertaining titles, they expect an achievement’s description to specify how to earn it. If you were to create a WWDC21 achievement, for instance, you might write the following:

Title: Code Completionist
Description: Watched every WWDC21 Code-Along session.

Be succinct
The Game Center achievement card limits your title and description to two lines each before truncating the text — brevity is key to a great achievement.

Think inclusively
Follow the Human Interface Guidelines around inclusivity when creating achievements. The best jokes, puns, and wordplay are those that are intuitive and friendly to everyone who might interact with your app or game, and make players feel recognized and rewarded.

Add unique, high-quality images
People appreciate earning unique achievements that remind them of each accomplishment. When you create custom artwork, you can help that achievement stand out from the others in your app or game and make it more recognizable to people who interact with it.

You can learn more about how to design great achievements in Apple’s Human Interface Guidelines, and in the WWDC20 session “Design for Game Center.”

Design for Game Center

Get your game’s interface ready for Game Center. We’ll show you how to deliver personalized touches to the GameKit interface that provide a rich experience for players, with features like achievements, leaderboards, and multiplayer gaming. Learn how to customize your game’s access point, design…

Download the Achievement Unlocked Challenge material

Learn more about designing achievements

Read the WWDC21 Challenges Terms and Conditions


Challenge: Focus on Focus APIs in SwiftUI

Image showing three text fields with one in focus

With device input — as with all things in life — where you put your focus matters. Focus can help people move through your app, whether they’re using the keyboard, Siri Remote, Apple Watch Digital Crown, or accessibility features — and you can make that experience even better with SwiftUI’s Focus APIs.

Begin the challenge

Our challenge to you: Find a part of your app where you can use the SwiftUI Focus APIs to fine-tune that interaction. That could include testing a great new tvOS implementation, polishing keyboard-driven navigation, or crafting a great accessibility experience.
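As one possible starting point, here’s a minimal sketch of programmatic focus with @FocusState — the form and field names are hypothetical:

```swift
import SwiftUI

struct LoginView: View {
    enum Field: Hashable {
        case username, password
    }

    @State private var username = ""
    @State private var password = ""
    @FocusState private var focusedField: Field?

    var body: some View {
        VStack {
            TextField("Username", text: $username)
                .focused($focusedField, equals: .username)
            SecureField("Password", text: $password)
                .focused($focusedField, equals: .password)
            Button("Sign In") {
                // Move focus to whichever field still needs input.
                if username.isEmpty {
                    focusedField = .username
                } else if password.isEmpty {
                    focusedField = .password
                }
            }
        }
        .onAppear { focusedField = .username }
    }
}
```

Binding focus to an enum this way means a keyboard user is never left without an obvious next step after tapping Sign In.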

Need support, or want help from the community as you explore the Focus APIs? You can share your progress in the Developer Forums.

Visit the Apple Developer Forums

Resources

Direct and reflect focus in SwiftUI

With device input — as with all things in life — where you put focus matters. Discover how you can move focus in your app with SwiftUI, programmatically dismiss the keyboard, and build large navigation targets from small views. Together, these APIs can help you simplify your app’s interface…

SwiftUI Accessibility: Beyond the basics

Go beyond the basics to deliver an exceptional accessibility experience. Learn how to use the new SwiftUI Previews in Xcode to explore the latest accessibility APIs and create fantastic, accessible apps for everyone. Find out how you can customize the automatic accessibility built into SwiftUI to…

Read the WWDC21 Challenges Terms and Conditions


Challenge: Memgraph Capture the Flag

Flag symbol on grey background

In the “Detect and diagnose memory issues” session at WWDC21, we explored how debugging memory problems can help improve your app’s performance, while “Symbolication: Beyond the basics” showcased debug symbols and how symbolication helps us connect the dots during code debugging. Now, it’s time to put those new skills to work.

If you like solving puzzles, you’re in the right place. One of our engineers has hidden a memory easter egg in our secret app. We’re trying to track it down, but all we know is that it has the format flag_<unknown_string_here>@WWDC. You’ll have to use the command line tools offered by macOS to investigate the memory issue, recover missing symbols, and capture the rogue flag.

Begin the challenge

To get started, download the challenge .zip attached to this article and unzip the folder. We also have a message from our engineer to get you on the right track: “Memgraph is a special binary plist. What can you find in its properties?”
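No spoilers here, but the macOS command line tools from the sessions above are a reasonable place to start. The memgraph file name and the address below are placeholders for whatever you find in the challenge download:

```shell
# The hint says a memgraph is a special binary plist: print its properties.
plutil -p AppName.memgraph

# Inspect leaks and reference cycles captured in the memgraph.
leaks AppName.memgraph

# Trace the allocation history of an address you find suspicious.
malloc_history AppName.memgraph --fullStacks 0x600001234560
```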

Download the Memgraph Capture the Flag Challenge material

And once you’ve explored the challenge, check out one solution to find the flag.

Challenge: Solution to “Memgraph Capture The Flag”

The “Memgraph Capture the Flag” challenge invites you to learn and practice memory debugging and symbolication with command line tools. If you haven’t yet attempted the challenge or otherwise don’t want to be spoiled on the necessary steps to complete it, we recommend returning to the…

You can solve these kinds of puzzles and track down memory issues in your own app, too. Try creating reference cycles in your app, saving a memgraph, and tracing them back to your source code. And for more debugging details, check out the WWDC21 sessions below.

Resources

Symbolication: Beyond the basics

Discover how you can achieve maximum performance and insightful debugging with your app. Symbolication is at the center of tools such as Instruments and LLDB to help bridge the layers between your application’s runtime and your source code. Learn how this process works and the steps you can take…

Detect and diagnose memory issues

Discover how you can understand and diagnose memory performance problems with Xcode. We’ll take you through the latest updates to Xcode’s tools, explore Metrics, check out the memgraph collection feature in XCTest, and learn how to catch regressions using a Performance XCTest.

Read the WWDC21 Challenges Terms and Conditions


Challenge: Create amazing documentation

Swift packages icon of brown box with swift icon on it

Explore Xcode’s new documentation features and learn how to add documentation to your own framework or package — or to your favorite open source project. For this challenge, use Xcode 13 to build documentation from the comments in your Swift framework, and add a Documentation Catalog to organize your content.

Begin the challenge

Open up your project in Xcode, and start adding documentation comments in your source by using Swift DocC markdown syntax. DocC uses the comments you write in your source code as the content for the documentation pages it generates. At a minimum, add basic documentation comments to the framework’s public symbols for DocC to use as their single-sentence abstracts or summaries. Here’s an example:

 /// A model representing a sloth.
 public struct Sloth {
     /// The species of this sloth.
     public var species: Species
 }

Once you’ve finished your documentation, select Product > Build Documentation to generate your source docs for Quick Help and the Developer Documentation window.
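If you prefer the command line, Xcode 13 can produce the same documentation archive with xcodebuild — the scheme name here is a placeholder for your own:

```shell
# Build a documentation archive for your framework's scheme.
xcodebuild docbuild \
    -scheme MyFramework \
    -destination 'generic/platform=iOS'
```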

Xcode documentation window displaying information about sloths

Need help writing or constructing your documentation? You can share your progress on the Developer Forums.

Visit the Apple Developer Forums

Resources

Read the WWDC21 Challenges Terms and Conditions


Challenge: Voice Control Synonyms

Icon of speech bubble with question mark in it on purple background

Challenge yourself to make your app accessible through Voice Control and provide support for voice-based interaction. Voice Control is a feature built into iOS, iPadOS, and macOS, and empowers those who can’t use traditional input devices to control their Mac, iPhone, and iPad entirely with their voices. For people with motor limitations, having full voice control of their devices is truly transformative. People can gesture with their voices to click, swipe, and tap anywhere — they can do everything someone could do with a mouse or with touch. On iOS and iPadOS, Voice Control has the additional option to show Item Names, which place a name next to each tappable item. In this challenge, we’ll be making the “Show Names” experience better.

Voice Control in Podcasts on iOS: “Show names.”

Suppose that you create a button that looks like a paper airplane. What do you say to tap? “Tap send”? “Tap reply”? “Tap airplane”? In UIKit you can use the accessibilityUserInputLabels string array to respond to these prompts, while in SwiftUI you’d use the .accessibilityInputLabels modifier.
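In UIKit, that might look like the following sketch — the button setup is hypothetical; accessibilityUserInputLabels is the property under discussion:

```swift
import UIKit

// A hypothetical send button drawn as a paper airplane.
let sendButton = UIButton(type: .system)
sendButton.setImage(UIImage(systemName: "paperplane"), for: .normal)

// The first label is what Voice Control shows on screen;
// the rest are spoken synonyms.
sendButton.accessibilityUserInputLabels = ["Send", "Reply", "Airplane"]
```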

How to enable Voice Control

To use Voice Control, go to Settings > Accessibility > Voice Control. If it’s your first time enabling this setting, you’ll be asked to Set Up Voice Control and download a short file.

Once Voice Control has been set up, you can enable it in a few different ways:

  • You can ask Siri to turn Voice Control on or off for you at any time.
  • You can use the Accessibility Shortcut in Settings > Accessibility and set the shortcut to Voice Control. Then, when you triple-click the side button (or Home button, depending on your device), you can quickly turn Voice Control on or off.

Use Voice Control to interact with iPhone

Begin the challenge

We’re challenging you to make your app’s UI as easy to navigate by voice as possible and improve the Voice Control experience in your app. Start by turning on Voice Control by visiting Settings > Accessibility > Voice Control, and enable Overlay > Show Names.

Next, take a screenshot of your interface with the “Show names” overlay displaying on top of it. Explore what it’s like to navigate your app by Voice Control alone. What experience are you giving people right now? Are you struggling with any common tasks? How could you make it better?

Once you’ve spent some time with your app in Voice Control mode, it’s time to make some improvements. Here are a few tweaks you can make to your code to make your experience better for everyone.

Explore accessibilityInputLabels
First, you can implement accessibilityInputLabels to create short, concise labels that someone could easily speak by voice.

Button(action: {
    sendMessage = true
}) {
    Image(systemName: "paperplane")
        .font(.title)
        .accessibilityInputLabels(["send", "reply", "airplane"])
}

Tips:

  • Your primary string is the first string in the array, and will be the one that Voice Control shows on screen.
  • Brevity is key: use short, succinct words.
  • Localize your strings using NSLocalizedString and avoid special symbols in your labels.
  • Add synonyms judiciously: limit the array to a maximum of four strings so as not to overload the recognition system.

You may have multiple elements in your UI that could be described the same way: One example is an image browser, where each image might be described as “Screenshot”. You can rely on Voice Control’s disambiguation feature in these cases to keep your label names short. When someone says “Screenshot”, a list of numbers will appear over all elements named “Screenshot” for someone to choose from.

Two screenshots with voice control labels, showing Voice Control’s disambiguation feature

Shorten label names
If your app already incorporates accessibilityLabel, you’ve done a lot of the work already — but your labels may be too long to speak! You can take advantage of accessibilityUserInputLabels (or, in SwiftUI, .accessibilityInputLabels) to keep the speakable label short, while leaving the valuable information your current accessibilityLabel conveys to an audience that relies on it.

two views with voice control labels shown, comparing “birthday plans edited 4 days ago” versus “birthday plans”

Share your experiences
As you add support for Voice Control to your app, share your implementation with the developer community. After you’ve made changes or improvements to your app, take another screenshot of your UI with the “Show names” overlay enabled. Share “before” and “after” screenshots on the Developer Forums. (And don’t forget to add alt text to your screenshot images on platforms that support it!)

Resources

Visit the Apple Developer Forums

Voice Control

accessibilityUserInputLabels

accessibilityInputLabels(_:)

Read the WWDC21 Challenges Terms and Conditions


Challenge: Create fun visual effects in Swift Playgrounds

Hammer symbol and paint brush symbol

Ever wonder how to make it seem like confetti is raining down from the sky? Or how to create a kaleidoscope effect using code? In this challenge, your goal is to create a compelling visual effect using the Shapes book from the Swift Playgrounds app. Maybe it’s a constellation of objects revolving in intriguing mathematical patterns, or a textural and fluid shape that adapts to your touch. It’s all up to you: What kind of visual effect would you like to dream up?

Begin the challenge

To get started, download and open Swift Playgrounds on your iPad or Mac, then select See All from the lower right corner to launch the Swift Playgrounds content screen. From here, you can find the Shapes book under “Starting Points” and download a copy to your device.

Swift Playgrounds app showing downloadable books and challenges

The Shapes starting point has some great examples to reference as you get started. Check out the page “Shape Graphics” to explore the book’s basic API for creating all shape types and placing them in the scene. “Touches and Animations” will show you how to apply animations to shapes and use touch events to drive behaviors. And finally “Sprite Shapes” can help you learn how to set up physics interactions between different shapes. From there, you’ll have all you need to create your own visual composition.

Want to show off your visual concept to the community? You can share your creation (or creation-in-progress) on the Developer Forums.

Visit the Apple Developer Forums

Resources

Download Swift Playgrounds for macOS

Learn more about Swift Playgrounds

Read the WWDC21 Challenges Terms and Conditions


Optimize your app for 5G

Wi-Fi and LTE have long helped apps deliver connected experiences like video streaming, social networking, and online gaming — and 5G networking can provide even more opportunities to take advantage of high-bandwidth and low-latency connections in your app. Discover how you can optimize your existing app or build a new product from the ground up with 5G in mind so that you can move more data faster and deliver a great experience to people around the world.

Up your 5G game

One of the best ways you can take advantage of 5G’s high-performance characteristics is to offer multiple versions of an asset depending on someone’s network connection. If a person using your app is on a lower-bandwidth connection, your app can download or deliver files appropriate for that network; likewise, you can deliver higher-bandwidth assets when the connection supports it so that you can provide more information quickly. You can apply this to many types of apps, including:

Streaming apps
Video streaming apps can incorporate intelligent buffering and playback with AVFoundation to serve 4K (or 4K HDR) content to devices on 5G networks as well as lower-bandwidth options when connected to LTE.

Learn more about AVFoundation

Games
Use the bandwidth available on 5G networks to deliver higher-quality visuals and gameplay with larger texture maps and higher-poly models than you might otherwise on lower-bandwidth networks. Additionally, 5G’s lower-latency connection allows for faster overall play and state-saving actions between players and your server backend.

Learn more about SceneKit

Machine learning apps
If your app uses Core ML, you can improve both the speed and reliability of your on-device intelligence when connected to a 5G network by automatically retrieving larger .mlmodel and .mlarchive files from your server to run locally on someone’s device.

Core ML

AR apps
While on 5G, you can provide a greater number of high-resolution objects within your ARKit scenes to provide a richer augmented experience for people interacting with your app. You can also use the extra bandwidth available over 5G networks to share even larger ARWorldMap and ARPointCloud objects in a shared AR experience — for instance, working collaboratively to lay out physical spaces with virtual objects from your app.

ARKit

Tune your transfers

Apple networking APIs automatically provide optimized management and performance for each platform and network type. In addition, you can further optimize your app to address potential cellular issues like movement speed and direction, cellular infrastructure demand, and interference.

Forget the network
Because 5G networks typically offer better performance than Wi-Fi, it’s up to you to decide how your app best utilizes network resources — and you no longer need to rely on the overall network type (cellular or Wi-Fi) to do so. Instead, you can use Constrained and Expensive to describe various network states. Each of these states relies on information from a person’s Data Mode choices (as defined in Settings > Cellular > Cellular Data Options) as well as their cellular plan restrictions.

isExpensive

isConstrained

For example, the network automatically switches to Constrained when someone enables Low Data Mode. When a person’s network is listed as Constrained, your app should minimize network data usage regardless of the value of Expensive. If the network is Expensive but not Constrained, your app should be considerate when fetching network resources without imposing strict restrictions on itself. If the network is neither Constrained nor Expensive, your app can focus on providing the highest quality experience with minimal consideration for data usage.

Each networking framework uses the Constrained and Expensive indicators in specific ways. When using URLRequest, for example, your app can indicate which resource should be retrieved by setting the appropriate values on the allowsConstrainedNetworkAccess and allowsExpensiveNetworkAccess properties. In contrast, when using NWConnection, your app can access the state of the network through stateUpdateHandler via the isConstrained and isExpensive properties of your connection’s currentPath. And if your app uses AVFoundation instead of the Network framework or URLRequest, there are similar keys, including AVURLAssetAllowsConstrainedNetworkAccessKey and AVURLAssetAllowsExpensiveNetworkAccessKey.
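As a sketch of how these indicators might drive asset selection — the URL, AssetQuality type, and fetchAsset function below are hypothetical:

```swift
import Foundation
import Network

enum AssetQuality { case low, medium, high }

// Hypothetical download entry point for this sketch.
func fetchAsset(quality: AssetQuality) { /* start the appropriate download */ }

// Observe path changes and pick an asset tier from the cost indicators.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.isConstrained {
        fetchAsset(quality: .low)      // Low Data Mode: minimize data usage
    } else if path.isExpensive {
        fetchAsset(quality: .medium)   // cellular or hotspot: be considerate
    } else {
        fetchAsset(quality: .high)     // unrestricted: highest quality
    }
}
monitor.start(queue: .main)

// With URLSession, individual requests can opt out of constrained or
// expensive networks, and the URL loading system enforces the policy.
var request = URLRequest(url: URL(string: "https://example.com/asset-4k.mov")!)
request.allowsConstrainedNetworkAccess = false
request.allowsExpensiveNetworkAccess = true
```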

Regardless of technique, remember that the values for Constrained and Expensive are transient and can change as someone moves from one type of network connection to another. If your app dynamically monitors these changes, you’ll always provide the best experience for people, no matter their connection.

allowsExpensiveNetworkAccess

prohibitExpensivePaths

Provide a fallback
Unless your app is designed specifically for a network with guaranteed performance characteristics, like a corporate or private connection, you should always make sure it functions well — even when there’s no network available at all. When someone initially downloads your app, make sure it delivers acceptable-quality assets as part of the bundle. If the app has periods of fast connectivity, you can then download higher-quality files and store them locally to ensure they’re available when someone leaves network range or goes offline entirely.

Support your surroundings
Most cellular providers have prioritized 5G rollouts in high-density areas: entertainment venues such as sports stadiums and amphitheaters, transportation hubs like train stations and airports, centers for business and education, and points of interest like public parks and tourist landmarks. When people recognize they’re in a high-performance networking location, they may want to explicitly enable caching and other features in your app before heading to a destination with reduced coverage — consider incorporating interface elements that notify people and let them immediately download any relevant content.

Take advantage of built-in frameworks
Apple’s hardware and on-device frameworks are tuned to deliver advanced functionality in a power-efficient manner. For example, you can use Core ML for on-device intelligence instead of client-server round trips, or ARKit and the Vision framework for capturing, processing, and presenting insightful information in the field. On-device processing minimizes the need to exchange large amounts of data — let alone potentially personally identifiable information — and eliminates the need to connect to a back-end process to provide a useful service in your app.

When you do need to move large amounts of data, you can lean on Smart Data Mode for 5G-enabled devices. This feature monitors your app’s state along with any Apple frameworks you’re currently using to automatically switch between existing cellular frequencies in a manner that ensures your app receives the highest possible bandwidth — all without sacrificing battery life.

For example, when an app is in the foreground and playing video using the AVFoundation framework, Smart Data Mode ensures that high-bandwidth 5G is enabled. In addition, Smart Data Mode monitors the streaming experience while someone is connected to a 5G network. If the stream is throttled due to traffic shaping — either by the cellular provider or the limitations of your CDN — the feature will identify the throttled throughput and move the stream to an LTE connection to conserve power. Background requests for data using the core networking frameworks can be served just as well over LTE or lower power frequencies.

Get out there

Previously, testing your app’s networking code involved toggling the network state, switching between Wi-Fi and cellular data, and then using a network conditioner and other tools to alter various characteristics. While this is still a great way to test for basic use cases, nothing beats getting out and exploring the edge-case scenarios only a deployed network can throw your way.

Start small
The App Store has a variety of apps you can use to determine the network characteristics of a given area. With one of these apps and a regional carrier’s wireless coverage map, you can track down the perfect spots in your area to ensure your app is selecting the correct resources at the right time. And once you’ve found that perfect 5G networking spot, move to another where coverage is sub-optimal and check your app. Did it keep running? Did your streaming content degrade as expected or move to local resources? Did it deliver acceptable-quality assets after a fresh install? The more real-world use cases you can test in advance, the better the overall experience will be for people around the world.

Go big
While many third-party websites provide performance data for cellular networks, that data is aggregated and only approximates the performance at the location where someone might be using your app. Because this data is only valuable as a baseline, it’s not a substitute for knowing how your app performs in the wild. You can use TestFlight for iOS to scale your beta testing to people and networks around the world. You may also want to consider creating a TestFlight group — not only to ensure your app is bug-free, but also that it performs well given the overall network characteristics of the cohort you’ve assembled.

Move to the edge
You can’t always control where your server infrastructure resides, but when you can influence that decision, take steps to minimize the distance between your server and your app. Reducing the distance between people and your backend can vastly improve your app’s network performance. One way to do this is to select a hosting provider that can federate your server back end to map closely to the cellular networks your app uses. Alternatively, you may want to consider using a few strategically located CDNs.

Reach out

5G networks provide a real opportunity to enhance your existing app with richer data or build entirely new experiences that weren't previously possible. If you're working on creating an amazing experience with 5G and would like to share it with us, let us know.

Contact us

Learn more about supporting 5G in your apps


WWDC21 Daily Digest: Day 4

A Memoji staring into an open MacBook Pro

Welcome to day 4 of WWDC, or — as we’re calling it — Apple Design Awards day! We’ve got a fresh round of session videos, labs, challenges, and some hardware to hand out later on, as well as lots of other fun activities in our pavilions and digital lounges. Read on.

And the Apple Design Award winners are…

… being announced this afternoon! Stream the live presentation of the Apple Design Awards starting at 2 p.m. PDT. (Virtual rounds of applause will be accepted.)

WWDC21 Apple Design Awards

WWDC21 Apple Design Awards (ASL)

Get all caught up

Missed any of the fun this week? No worries: Our official recap videos will get you caught up in no time.

Wednesday@WWDC21

Tuesday@WWDC21

Monday@WWDC21

Day 4 in the WWDC pavilions

Another set of great sessions, labs, and activities have arrived in the pavilions: Try out a Framework Freestyle in the Essentials pavilion and learn a new framework in 100 lines of code or less. Discover how to design memorable SharePlay experiences in the Audio and Video pavilion. Sign up for one of Friday’s design labs in the Design pavilion. And get a glimpse of a magnificent future without passwords in the Privacy & Security pavilion.

Design for spatial interaction

Design for Group Activities

Discover rolling clips with ReplayKit

Create image processing apps powered by Apple Silicon

Optimize high-end games for Apple GPUs

Challenge: Framework Freestyle

Build Mail app extensions

Swift concurrency: Behind the scenes

Learn to meditate (even if you’re fidgety)

At 11 a.m. PDT, hear from special guest speaker Dan Harris, an Emmy-winning journalist, Good Morning America anchor, and author of the best-selling book Meditation for Fidgety Skeptics. After having a nationally televised panic attack in 2004, Harris found himself on a long and often bizarre journey that ended with the discovery of mindfulness meditation. Today, Harris will discuss his journey, as well as the books, podcast, and app that have helped millions manage the stress and anxieties of today’s world — including onetime skeptics like himself. (Want a sneak peek? Check out Harris’s app Ten Percent Happier.)

Meditation for fidgety skeptics

Lock down a lab appointment

There’s still one more day to register for a lab appointment with Apple engineers, designers, and specialists for 1-to-1 guidance and conversation.

Until Day 5…

That’s it for today! But rest up — we’ve got one more big day for you tomorrow.


Challenge: Framework Freestyle

Framework icon with a question mark on top of it on a yellow background

No matter your level of expertise, it can be daunting to step out of your comfort zone when you’re first learning about new frameworks or technologies. Our challenge today presents a fun and interactive way to encourage you to try something new with an ARKit sample app and one framework of your choosing. What can you create in 100 lines of code or less?

Begin the challenge

This challenge is a gamified augmented reality experience created with RealityKit and ReplayKit. To participate, you’ll need the developer betas of iOS 15 and Xcode 13. Once you have them, download the Framework Freestyle sample project from this challenge and open it in Xcode, then build and run the app on your iOS device.

When you engage with the app, it triggers a mystery sequence that randomly selects one of several Apple frameworks. Here comes the fun part: We’re asking you to build something new using whatever framework the randomizer lands on — and do so using 100 lines of code or less! For example, if it lands on SwiftUI, you could experiment in Xcode with the canvas, or try making a basic search bar with .searchable. Don’t worry too much about building something perfect: Use this challenge to break the ice, learn, and have fun.
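If SwiftUI is your draw, a basic searchable list fits comfortably inside the 100-line budget. This sketch assumes a made-up `fruits` data set and requires iOS 15 and Xcode 13, since .searchable is new in that release:

```swift
import SwiftUI

struct ContentView: View {
    @State private var query = ""
    private let fruits = ["Apple", "Banana", "Cherry", "Mango"]

    // Filter the sample data against the current search text.
    private var results: [String] {
        query.isEmpty
            ? fruits
            : fruits.filter { $0.localizedCaseInsensitiveContains(query) }
    }

    var body: some View {
        NavigationView {
            List(results, id: \.self) { Text($0) }
                .searchable(text: $query)
                .navigationTitle("Fruits")
        }
    }
}
```

Attaching .searchable to the List inside a NavigationView gives you the system search field, placement, and dismissal behavior for free — a nice reminder of how much a single modifier can do.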

Resources

WWDC21 Challenge: Framework Freestyle

Read the WWDC21 Challenges Terms and Conditions