WWDC22 Daily Digest: Thursday

It’s Day 4 of WWDC and a fresh round of Digital Lounges, labs, and activities awaits you. Before we kick off Thursday, catch up on yesterday’s dispatch:

WWDC22 Day 3 recap

Your Day 3 status report is here. Catch up on the latest from WWDC22 and discover what’s coming to a Thursday near you.

Spotlight on sessions, Digital Lounges, and labs

We’ve got a packed Thursday for you! Discover the latest updates to Vision, learn how to write for interfaces and collaborate with Messages, find out how to use SwiftUI with UIKit, create complications with WidgetKit, and — a WWDC first! — watch “Design for Arabic” delivered in Arabic by presenter Mohamed Samir.

What’s new in Vision

Learn about the latest updates to Vision APIs that help your apps recognize text, detect faces and face landmarks, and implement optical flow. We’ll take you through the capabilities of optical flow for video-based apps, show you how to update your apps with revisions to the machine learning…

Design for Collaboration with Messages

Discover how you can design great collaboration experiences using Apple platforms. We’ll show you how to combine the Share Sheet, live editing notifications, Messages, FaceTime, and your app’s existing collaboration features to help people connect and collaborate effortlessly. (Note: API will…

Writing for interfaces

The words and phrases you choose for your app matter. Whether you’re writing an alert, building an onboarding experience, or describing an image for accessibility, learn how you can design through the lens of language and help people get the most from your app. We’ll show you how to create clear,…

Use SwiftUI with UIKit

Learn how to take advantage of the power of SwiftUI in your UIKit app. Build custom UICollectionView and UITableView cells seamlessly with SwiftUI using UIHostingConfiguration. We’ll also show you how to manage data flow between UIKit and SwiftUI components within your app. To get the most out…

Design for Arabic · صمّم بالعربي

Learn the fundamental principles of designing digital interfaces in Arabic. Whether you want to design an app or game specifically for an Arabic-speaking audience, or to localize an app from another language into…

And once you’re finished watching sessions for the day, join the Digital Lounges for more great Q&A from our design, photos, and Swift teams as well as live watch parties with the presenter of “Explore navigation design for iOS.”

Thursday is also your final day to request a lab appointment for a conversation with Apple engineers and designers. Come say hi and ask your questions!

Trivia Night is back

WWDC Trivia Night returns tonight at 6 p.m. PT in the Developer Tools Lounge. Put your brain to the test on such pressing questions as: What was the first supported programming language for Mac development? What’s the deal with 9:41? Come test your wits, compete with your friends and Apple staff, and suggest questions to stump the experts in the room.

Where we’re going, we don’t need roads

We’re going back in time with our Thursday Throwback SwiftUI challenge! Create a SwiftUI view that reimagines your app clothed in the interfaces of the past. Dress your UI up in the grayscale style of System 6, the linen of early iPhoneOS, or another time period entirely! Visit the SwiftUI Study Hall to collaborate on the “Throwback Thursday” coding challenge. Ask questions, connect with other developers, and share your creations on Twitter using the hashtag #WWDC22Challenges.

Headphones on

An immersive app doesn’t only look and feel great — it has to sound incredible, too. During WWDC, we spoke with four Apple Design Award finalists about the sensational sounds of their apps and games: the incredible and immersive soundscape app Odio; the elegant and jazz-fueled Please, Touch the Artwork; the longstanding mindfulness resource Headspace; and the gorgeous tap-along rhythm game A Musical Story.

Sound advice

Inside the sublime audio of four Apple Design Award finalists.

Spin the music of WWDC

And speaking of music: Give our official WWDC playlists a listen — they’re perfect if you need chill background music or a little audio kick to get you going.

Listen to WWDC22 playlists on Apple Music

Have fun out there, and we’ll see you tomorrow to close out WWDC22!

Challenge: Bindless ray tracing

Mirror, mirror on the … other mirror. In this challenge, we invite you to explore bindless rendering in Metal 3 and reflect rays on mirrored surfaces.

Thanks to the bindless enhancements in Metal 3, the HybridRendering sample app looks better than ever. It makes all scene resources available to its shaders using Argument Buffers, then uses Metal ray tracing to produce reflections on metallic surfaces — like the ones below.

But as beautifully as the app has drawn this scene, there’s still a limitation: It’s unable to show reflections within reflections, like the mirrored floor reflecting the mirrored sphere.

In fairness: It’s hard to show mirrors reflecting mirrors! Light bounces between the two surfaces forever, creating an infinite recursion that no renderer can evaluate exactly. Ray tracing apps work around this by simulating only a limited number of light (or ray) “bounces” in the scene, trading perfect accuracy for added realism at a fixed cost.

In this challenge, we invite you to extend that ray tracing code and increase your image’s realism by adding one (or more) extra ray bounces.
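To see why a cap works, it helps to reduce the problem to arithmetic: each bounce between two mirrors contributes a dimmer and dimmer amount of light, so stopping after a fixed number of bounces yields a finite, convergent sum. A standalone sketch in plain C++ (no Metal involved; the 50% reflectance and function name are our assumptions):

```cpp
#include <cassert>
#include <cmath>

// Each mirror reflects 50% of incoming light. Without a cap, light bounces
// forever; with a cap, we accumulate a finite number of contributions.
float traceBetweenMirrors(int maxBounces) {
    float color = 0.0f;           // accumulated radiance
    float throughput = 1.0f;      // how much energy the ray still carries
    const float reflectance = 0.5f;
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        throughput *= reflectance;  // lose energy at each mirror hit
        color += throughput;        // each bounce adds a dimmer contribution
    }
    return color;  // approaches 1.0 as maxBounces grows, but never diverges
}
```

Adding "one (or more) extra ray bounces" in the challenge is the same idea: one more loop iteration, one more (dimmer) layer of reflection.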

Begin the challenge

Before entering this hall of mirrors, we recommend first watching “Go bindless with Metal 3.” After you watch, download the “Rendering reflections in real time using ray tracing” sample code — we’ll be using it for this challenge.

Go bindless with Metal 3

Learn how you can unleash powerful rendering techniques like ray tracing when you go bindless with Metal 3. We’ll show you how to make your app’s bindless journey a joy by simplifying argument buffers, allocating acceleration structures from heaps, and benefitting from the improvements to the…

Rendering reflections in real time using ray tracing

The app has a dedicated compute pass that calculates reflections from a thin G-Buffer containing positions and normals for each pixel in the image.

The ray tracing shader reads this data and uses it with the camera’s view direction to calculate the direction of the reflected rays. It then uses Metal to trace these rays, find intersections, and shade reflections.

raytracing::ray r;
r.origin = positions.read(tid).xyz;
r.direction = normalize(directions.read(tid).xyz);
r.min_distance = 0.1;
r.max_distance = FLT_MAX;

raytracing::intersector<raytracing::instancing, raytracing::triangle_data> inter;
inter.assume_geometry_type( raytracing::geometry_type::triangle );

auto intersection = inter.intersect( r, accelerationStructure, 0xFF );

if ( intersection.type == raytracing::intersection_type::triangle )
{
    // Shade the hit point here.
}

This produces the following image:

But there’s a problem! The fire trucks are missing from the sphere’s reflection on the floor. We challenge you to reveal the missing trucks by modifying the ray tracing shader, rtReflection, to add an additional ray trace step.

To complete this challenge, you’ll:

  1. Use the reflected normal and intersection position to calculate the next bounce of rays.
  2. Extract the material shading logic into a helper function that allows you to shade reflections within the reflections.
  3. Combine all reflected colors and write them into the outImage.
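For step 1, the core math is the classic reflection formula: given an incoming direction d and a unit surface normal n, the bounced direction is r = d − 2(d · n)n, which is what Metal's reflect() helper computes for you. A standalone sketch in plain C++ (the Vec3 type and names are ours, standing in for Metal's float3 math):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Reflect direction d about unit normal n: r = d - 2(d.n)n.
// Feeding this reflected direction back into the intersector, with the
// intersection point as the new origin, gives you the next bounce.
Vec3 reflectDir(Vec3 d, Vec3 n) {
    float k = 2.0f * dot(d, n);
    return { d.x - k * n.x, d.y - k * n.y, d.z - k * n.z };
}
```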

When you’re done, use the screenshot tool, GPU Debugger, or QuickTime to capture your solution and show us your work by posting it on Twitter with the hashtag #WWDC22Challenges. And if you’d like to discuss bindless ray tracing and other Graphics & Games topics, join the team at events throughout the remainder of the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Challenge: Design for superheroic navigation

Calling all designers: We’re inviting you to use your powers to design a “super” navigation experience for an app that helps our developer heroes fight code-tastrophes and design disasters.

In this challenge, you’ll design a tab bar and screen of a fictional app to help your favorite superheroes. Whatever kind of app you decide to create, your challenge is to organize its core functionality into sections on a tab bar.

Bonus: If you’re feeling super-creative, design the root screen of one of that app’s tabs. What would your heroes need to view? What actions would they take? (Note: Though you’re just designing one screen, the features designed in this view should work in harmony with the other tabs in your proposed app.)

We also welcome you to visit the Design Study Hall to collaborate on this challenge! Ask questions, connect with other developers, and share your creations.

Begin the challenge

To get started, we recommend watching “Explore navigation design for iOS” to learn how you can take advantage of existing navigation structures to simplify complex interactions in your app without compromising its personality. Explore best practices and common pitfalls when working with tab bars, modality, and more.

We also recommend checking out “Writing for interfaces” to find out more about creating clear, conversational, and helpful labels and writing in your app.

Explore navigation design for iOS

Familiar navigation patterns can help people easily explore the information within your app — and save them from unnecessary confusion. We’ll show you how to take advantage of existing navigation structures to simplify complex interactions in your app without compromising its personality. Learn…

Writing for interfaces

The words and phrases you choose for your app matter. Whether you’re writing an alert, building an onboarding experience, or describing an image for accessibility, learn how you can design through the lens of language and help people get the most from your app. We’ll show you how to create clear,…

Once you’re ready to start designing, visit the Apple Design Resources page to download the iOS design template and get access to tab bar symbols and iOS system colors. We also recommend downloading and exploring the SF Symbols app to create compelling iconography for your tab bar.

iOS apps can have between two and five tabs — so consider which features would be most relevant for the superhero app you’re designing. Don’t forget to use descriptive and succinct labels for each tab!

Apple Design Resources

Download SF Symbols

Show us your super work by posting it on Twitter with the hashtag #WWDC22Challenges, or share your work in the Design Study Hall. And if you’d like to discuss other Design topics, join the team at events all throughout the remainder of the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Challenge: Throwback Thursday with SwiftUI

Ready to add some chic retro styling to your SwiftUI views? In this challenge, we’re inviting you to create a SwiftUI view that reimagines your app or concept clothed in the interfaces of the past. Dress your UI up in the gorgeous grayscale of System 6, the linen of early iPhoneOS, or another time period entirely!

We also welcome you to visit the SwiftUI Study Hall to collaborate on this challenge. Ask questions, connect with other developers, and share your creations.

Begin the challenge

To get started, pick a year, era, or color scheme. Then, snap a screen from your app and take it back to those glory days. If you need a boost (or a challenge), use a random number generator to choose a year between 1984 and 2013 — or open up a Swift Playground:

let myCoolRetroYear = Int.random(in: 1984...2013)
print("Reimagine your app's interface like it's from the year \(myCoolRetroYear)!")

Wherever you land, think about the Apple Design Languages prominent during that era. If you’re newer to SwiftUI, experiment with Xcode Previews to see how much code the tools will write for you. If you have more experience, take this chance to play around. (And we love a reboot: If you joined us for the original 2021 SwiftUI Throwback Challenge, feel free to resurrect your 2021 project.)

Next steps

Share your time-machine masterpiece on Twitter with the hashtag #WWDC22Challenges, or share your work in the SwiftUI Study Hall. And if you’d like to discuss other SwiftUI topics, join the team at events throughout the remainder of the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Sound advice

An immersive app doesn’t only look and feel great — it has to sound incredible, too. During WWDC, we spoke with four Apple Design Award finalists about the sensational sounds of their apps and games. Come with us as we travel on a musical journey through cool jazz, Spatial Audio soundscapes, and even original album-length compositions from artists like Arcade Fire, St. Vincent, and Madlib.

Please, Touch the Artwork: A jazz thing

Thomas Waterzooi’s elegant puzzle game Please, Touch the Artwork is inspired by Dutch painter Piet Mondrian, who, with his iconic style of lines, squares, and primary colors, is considered a pioneer of 20th century abstract art. And that kind of guy? He needs the right kind of music.

“I tried to imagine what Mondrian would listen to while painting in his workshop,” says Waterzooi, the game’s Brussels-based developer. “Some kind of jazz. And since the game is designed to be relaxing, it would have to be a calm, dynamic kind of jazz.”

That mix of timeless art and cool soundtrack creates a classy vibe in Please, Touch the Artwork, whose puzzles are based on three of Mondrian’s most famous works: Composition with Red, Blue, and Yellow; Broadway Boogie Woogie; and New York City I.

As the puzzles grow and change, so does the music, which was created by composer Lars Burgwal. The music for the New York City section begins with only bass — and as you progress through each puzzle, the piano, saxophone, and vibraphone all come to play. (Waterzooi also added a little drum flourish whenever you tap a painting.) “With puzzle games, the music has to be relaxing,” says Waterzooi. “It can’t annoy you at any point.”

So not so much Broadway boogie woogie for Broadway Boogie Woogie? “It would be too fast!” laughs Waterzooi. “We couldn’t swing nearly that much.” He has, however, worked in a little nod to the style.

“The goal of that game is to join characters named Boogie and Woogie,” he says, “and when you do, there’s a little completion animation with a musical accent. It’s not much — just three or four notes — but it’s based on some boogie-woogie right-hand piano schemes.”

Download Please, Touch the Artwork on the App Store

Odio: Absolutely Spatial

Audio apps don’t get much more immersive than Odio. The Apple Design Award-winning 3D audio app employs a mesmerizing mix of Spatial Audio and head tracking to conjure up its AR soundscapes.

While you might flow between a rushing waterfall, the deep sea, and even a world of calming digital ambience, you’re no passive listener in these realistic realms: Each soundscape can be manipulated through a clever system of arcing sliders that reposition each sonic element — a rushing river, dreamy whalesong, or wash of digital static — around your head in 360 degrees.

Max Frimout is the app’s audio engineer, and though his work is heavy on synthetic, otherworldly digital elements, his audio career started with something considerably more analog. “I was originally a harpist,” he says. “One day I opened the ES1 Synthesizer in Logic Pro, and now I’m here!”

Odio originally focused on nature sounds, but after a few months of development, the Netherlands-based team at Volst wanted more. “‘What if we have musicians compose their own environments?’” says Roger Kemp, co-founder and designer at Volst. “That’s when it all clicked.”

Frimout is also one of the app’s five composers. A musician and DJ by trade, he began creating his Odio soundscapes with lines of melody, then layered in effects and flourishes with names like “synthetic water,” “moving chords,” and “filtered drone.” Soundscapes are built in Logic Pro and tested with AirPods Max. “That’s how I look around to hear how it feels,” he says.

Most of Frimout’s compositions are the result of sonic experimentation, but the soundscape called “Wow!” followed a more organic path. “I started with a series of melodies that basically all came to me in the same evening,” he says. “I think that shows how you can have all this equipment and all these concepts but still be incredibly inspired by a single event.”

And yes, it contains harp: That’s Frimout playing on the loop called Heartbreak — though you might not recognize the sound as strings. “It’s just three chordal structures,” he says with a laugh, “but they’ve been processed and processed and processed.”

Download Odio on the App Store

A Musical Story: That ‘70s game

A Musical Story is inspired by a very groovy time: “It’s all about the freedom of ‘70s music,” says Charles Bardin, the French composer/developer who created the game with art director Alexandre Rey, composer Valentin Ducloux, and developer Maxime Constantinian. “Mostly, we were inspired by the sense that, back then, anything could happen.”

Conceived in 2017 and launched in March 2022, A Musical Story is a harmonious mix of song, narrative, and art. The story follows an up-and-coming band trying to break into the business, replete with vintage guitars, outfits, and hairstyles. To move the narrative along, you tap your screen to the beat, creating some great soul- and R&B-inflected music in the process.

But the game is mostly wordless, driven by the primal, powerful connection between music and memory. It’s an ideal playground for Bardin, who studied at the Conservatoire de Musique de Lyon and who’s been creating and covering game music for more than a decade.

As it happens, the development process didn’t begin with the music — Bardin and Rey started by establishing the circular tap-along play mechanic. “In most games, the notes come down on the screen and you play them when they arrive,” says Bardin. “I love that, but it’s also something you can play without any sound. I wanted a game that really relies on listening.”

Once the team landed on the mechanic, it was time to tune into the songs themselves. “We knew we wanted short sequences of music to unlock the story,” says Bardin, “but a lot of musical games rely on electronic or techno music, where the beat is very clear. We wanted to prove that we could make more organic music — something that wasn’t quite so thump-thump-thump-thump.”

He also made sure the music drove the story along. “I wrote a song called Her for a scene in which the character goes to a pub, sees a girl playing music, and instantly falls in love with her,” says Bardin. “It begins with just a Rhodes piano and some bass and drums, but as you move closer to the stage, you hear more and more of the music. When you get close enough, you discover her face and her voice.” It’s the only time vocals appear in the game itself aside from the credits. “We wanted this moment to be powerful,” Bardin says. “This is the voice of the most important character in the game.”

Download A Musical Story on the App Store

Headspace: The music of mindfulness

Over the past few years, the meditation and mindfulness app Headspace has partnered with A-list musical artists to help people concentrate, relax, lock in, or nod off. With Focus Music (found, appropriately, in the Focus tab), the app has amassed an array of original music and playlists from artists like Arcade Fire, St. Vincent, Erykah Badu, Madlib, and even film composer Hans Zimmer.

Focus Music was designed in part by John Legend, the app’s chief music officer. “There’s so much possibility right here on our phones,” says Legend. “It can be a scary thing for some artists; it’s not what we’re used to. But if we take advantage of the possibilities, there are all these different ways to reach people.”

The singer-songwriter Aluna trained in reflexology, transcendental meditation, and tai chi — all skills she wove into her hour-long Headspace composition. To create it, she designed six-minute blocks of sound, grounded in specific spaces like a crackling campfire, bustling park in late afternoon, or dripping cave.

Strictly speaking, it was not her usual approach. “Normally when you write a song, you’re doing wordplay and you want dynamics,” she says. “It’s completely different from music that for an hour has no start and no finish.” (It’s also more complicated than it sounds — there’s a lot of difference between the sound of water dripping from a cave and dripping from your faucet.)

The science at the intersection of music and mindfulness is clear, says UC Berkeley cognitive neuroscience professor Sahar Yousef, who partnered with Headspace on Focus Music. “We know that when we play music in rehab facilities, people improve quicker,” Yousef says.

Here’s the (extremely abridged) explanation of what’s going on when you listen: Your brain forges connections via neural networks, the little zaps of electricity that constitute all your thoughts. The good news is that these networks can be manipulated, and you’re probably doing it right now. You can train yourself to think that the aroma of coffee means it’s time to wake up, and you can train your brain to recognize the music designed to chill you out.

In other words, these soundscapes serve as little life hacks. “Michael Phelps listened to Eminem before every race,” says Yousef. “This is the same thing.”

Download Headspace on the App Store

Sign up now for WWDC22 labs and lounges

Register for labs and Digital Lounges to connect with Apple engineers, designers, and experts online all week long.

Digital Lounges

A wide variety of exciting activities are happening daily on Slack.

  • Ask questions at engineering and design Q&As.
  • Join or follow real-time text-based conversations while watching a session video together, and stay for a short Q&A at Meet the Presenter activities.
  • Get to know other developers and teams from Apple in a casual setting during icebreakers.
  • Experiment with the latest frameworks, try out design concepts, participate in challenges, and share your creations in study halls.
  • Test your trivia expertise against the best in the business on June 9.

Labs

Receive one-on-one guidance about development basics, complex concepts, and everything in between. Learn how to implement new Apple technologies, explore UI design principles, improve your App Store product page, and much more.

Lounges and labs are open to all members of the Apple Developer Program and Apple Developer Enterprise Program, as well as 2022 Swift Student Challenge winners.

Register for labs

Register for lounges

Learn about WWDC22

Challenge: Create a reactive soundscape

Bring on the noise: It’s time for a sound design challenge! We’re inviting you to experiment with creative ways to manipulate sound on iPhone and iPad using their myriad sensors, inputs, and variable states. (Think of typing on the iOS keyboard — where the key sounds get gradually quieter the faster you type.) Explore over 70 audio files from Apple sound designers and create a sonic experience of your very own!

We also welcome you to visit the Design Study Hall during the day to collaborate on this challenge! Ask questions, connect with other developers and designers, and share your creations.

Begin the challenge

First, download our challenge sound library. In it, you’ll find more than 70 audio files from Apple sound designers, including:

  • ChromaticScale: Includes 13 one-shot audio files that make up the musical notes of a chromatic scale.
  • InstrumentalLoops: 155 bpm sound files that can be seamlessly looped, combined, and layered.
  • OneShots: One-shot sounds that can be used as alerts, notifications, and more.
  • Samples: Sustaining samples of one note each.
  • SineLoops: Looping pad sounds formed with sine waves. Much like the InstrumentalLoops, they can be layered, added, and removed.
  • SwitchesAndTaps: User-interface sounds.
  • Misc: A collection of fun and inspiring sounds.

Download sounds for the challenge

Using any of these attached sounds and AVAudioEngine, we invite you to create a sonic on-device experience that changes based on sensor input or device state. Consider factors like touch input, motion and acceleration sensing, GPS or compass position, ambient light sensing, and camera or microphone input.

  • Could you use touch gestures to morph a looping sound’s frequency or amplitude?
  • Could you use a device’s motion sensor to trigger a sound — or use the accelerometer to change the relative volumes of multiple sounds triggered at once?
  • Could you use GPS to change a notification sound based on your distance from home?
  • If your app detected a clap through the microphone input, could it trigger a sound? Could you use the amplitude of that microphone input to determine which sound plays?
  • Are there other variables that could be used to affect audio playback — time of day, weather, stock prices?
  • How could these ideas impact your existing projects?
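Whichever sensor you pick, the heart of the exercise is a mapping function from a raw reading to an audio parameter. Here is a framework-agnostic sketch in plain C++ (the input range and response curve are our assumptions; in a real app the result would drive something like the volume of an AVAudioEngine mixer node):

```cpp
#include <cassert>
#include <cmath>

// Map an accelerometer magnitude (in g) to a playback volume in [0, 1].
// Readings near 1g (device at rest) stay silent; shaking pushes volume up.
// The "2g of motion = full volume" scaling is an arbitrary choice to tune.
float volumeForAcceleration(float magnitudeInG) {
    float excess = std::fabs(magnitudeInG - 1.0f);  // motion beyond gravity
    float v = excess / 2.0f;
    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return v;
}
```

The same pattern works for any of the inputs above: normalize the sensor reading into [0, 1], then assign it to a volume, playback rate, or filter parameter.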

For inspiration, check out the Human Interface Guidelines on creating great audio experiences, and explore Apple developer documentation.

Human Interface Guidelines – Playing Audio

AVAudioEngine

Core Motion

Core Location

Nearby Interaction

CMHeadphoneMotionManager

Share your sonic creations on Twitter with the hashtag #WWDC22Challenges, or share your work in the Design Study Hall. And if you’d like to discuss sound design and other design topics, join the team at events all throughout the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Challenge: SwiftUI science fiction!

With its robots, spaceships, and occasional brains floating in jars, science fiction is the perfect playground for a creative challenge with SwiftUI. And so we’re inviting you to conceptualize or recreate a scene from your sci-fi dreams. Whether you use floating numbers, glowing monochrome code, or something from another universe, this is your chance to build the interface of your science-fiction dreams… or nightmares!

Begin the challenge

Set the scene and picture the science-fiction world you want to create. Are you in the near future? Part of an underground insurgency questioning the status quo? Inside a mysterious building, known only to those who work there? Floating out by a broken moon?

How would you interact with devices in this world? What sort of technology would you use? Your mission is to create a SwiftUI view in Xcode that brings that interface into our reality. For inspiration, try out SwiftUI tricks like layout and content transitions, which can help you peer through the fabric of spacetime (design-wise, at least).

Next steps

When you’ve finished your dystopian masterpiece, share it on Twitter with the hashtag #WWDC22Challenges, or share your work in the SwiftUI Study Hall. And if you’d like to discuss this or other SwiftUI topics, join the team at events all throughout the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Challenge: WidgetKit workshop

Take a glance at this WidgetKit challenge, won’t you? With glanceable experiences coming to the Lock Screen in iOS 16, it’s the perfect time to try building your very own Lock Screen widget.

Begin the challenge

If you’re exploring glanceable technologies for the first time, welcome! Before we get started with the challenge, check out “Complications and widgets: Reloaded” for an overview of the latest updates to WidgetKit.

Complications and widgets: Reloaded

Our widgets code-along returns as we adventure onto the watchOS and iOS Lock Screen. Learn about the latest improvements to WidgetKit that help power complex complications on watchOS and can help you create Lock Screen widgets for iPhone. We’ll show you how to incorporate the latest SwiftUI views…

Once you’re ready to begin, it’s time to examine your app: What parts of it might work as a widget? After you’ve identified an aspect, explore configuration options and the best timeline for your model.

If you already have a Home Screen widget, you can also explore reusing your SwiftUI code. Not every Home Screen widget is a great candidate for a Lock Screen widget, so consider different approaches to find the right one for your app.

For extra credit: What happens when you deploy your code to the Apple Watch? Have you found yourself most of the way to an awesome watchOS complication? Would you make any modifications to make your experience feel at home on Watch?

How will you transform your Lock Screen? Show us what you’ve made on Twitter with the hashtag #WWDC22Challenges, or share your work in the WidgetKit Study Hall. And if you’d like to chat more about WidgetKit, join the team at events all throughout the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions

Challenge: Draw with metal-cpp

Metal is the foundation for accelerated graphics and compute power on Apple platforms — and if you’re familiar with C++, now’s the perfect time to explore its incredible power. For this challenge, we’re inviting you to try out metal-cpp and render your own triangle, sphere, or even a mesh in Xcode.

We also welcome you to visit the Graphics & Games Study Hall during the day to collaborate on this challenge! Ask questions, connect with other developers, and share your creations.

Begin the challenge

Before you begin, you’ll want to watch “Program Metal in C++ with metal-cpp” and download the LearnMetalCPP project, which contains a series of C++ samples.

Program Metal in C++ with metal-cpp

Your C++ games and apps can now tap into the power of Metal. We’ll show you how metal-cpp helps you bridge your C++ code to Metal, explore how each manages object lifecycles, and demonstrate utilities that can help these languages cooperate in your app. We’ll also share best practices for designing…

Download the LearnMetalCPP project

Open the project in Xcode, and choose 00-window.cpp as your base code. To render your image, you’ll need to set up a few things within your project.

First, create a MTL::RenderPipelineState object with a MTL::RenderPipelineDescriptor. To do this, you’ll need to create a function, like buildShaders(). In the code snippet below, we’ve provided the shader code needed to render a single triangle.

void Renderer::buildShaders()
{
    using NS::StringEncoding::UTF8StringEncoding;

    const char* shaderSrc = R"(
        #include <metal_stdlib>
        using namespace metal;

        struct AAPLVertex
        {
            float3 position;
            half3 color;
        };

        // Feel free to modify the mesh as you want
        constant AAPLVertex triangles[] =
        {
            { float3{ -0.8f,  0.8f, 0.0f }, half3{ 1.0f, 0.3f, 0.2f } },
            { float3{  0.0f, -0.8f, 0.0f }, half3{ 0.8f, 1.0f, 0.0f } },
            { float3{ +0.8f,  0.8f, 0.0f }, half3{ 0.8f, 0.0f, 1.0f } }
        };

        struct v2f
        {
            float4 position [[position]];
            half3 color;
        };

        v2f vertex vertexMain( uint vertexId [[vertex_id]] )
        {
            v2f o;
            o.position = float4( triangles[ vertexId ].position, 1.0 );
            o.color = half3( triangles[ vertexId ].color );
            return o;
        }

        half4 fragment fragmentMain( v2f in [[stage_in]] )
        {
            return half4( in.color, 1.0 );
        }
    )";

    // TODO: Create a MTL::RenderPipelineDescriptor
    // TODO: Allocate a MTL::RenderPipelineState object
}

Then, extend the Renderer::draw( MTK::View* pView) function by setting a MTL::RenderPipelineState and inserting draw calls.

void Renderer::draw( MTK::View* pView )
{
    // ...
}

After that:

  • Create the MTL::RenderPipelineDescriptor object and set up some properties.
  • Create the MTL::RenderPipelineState object.
  • Tip: Be careful with object lifecycles.
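That lifecycle tip is worth unpacking: metal-cpp follows Cocoa ownership conventions, so any object you obtain from alloc()->init(), or from a method whose name begins with "new" (such as newRenderPipelineState), carries a reference you own and must release(). Here is a toy, self-contained illustration of that convention in plain C++ (this is not the real Metal API; the names are ours):

```cpp
#include <cassert>

// Minimal stand-in for a reference-counted metal-cpp object. In real
// metal-cpp, MTL::RenderPipelineDescriptor::alloc()->init() and
// Device::newRenderPipelineState(...) both hand you owned references.
struct Object {
    explicit Object(int* c) : liveCount(c) { ++*liveCount; }
    Object* retain() { ++refs; return this; }
    void release() { if (--refs == 0) { --*liveCount; delete this; } }
private:
    ~Object() = default;  // force destruction to go through release()
    int* liveCount;       // tracks how many objects are still alive
    int refs = 1;
};

int liveObjects(bool forgetToRelease) {
    int live = 0;
    Object* descriptor = new Object(&live);  // like alloc()->init(): you own it
    Object* pipeline   = new Object(&live);  // like newRenderPipelineState()
    descriptor->release();                   // done building: release it
    if (!forgetToRelease)
        pipeline->release();                 // ...and the pipeline when finished
    return live;                             // 0 when every reference is balanced
}
```

Forgetting that final release() is exactly the leak the tip warns about; in real code, metal-cpp's NS::SharedPtr helper can manage the reference for you.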

Ready to share your metal-cpp art with the community? Show us what you’ve made on Twitter with the hashtag #WWDC22Challenges, or share your work in the Graphics & Games Study Hall. And if you’d like to discuss metal-cpp and other Graphics & Games topics, join the team at events all throughout the week at WWDC22.

Explore #WWDC22Challenges on social media

Read the WWDC22 Challenges Terms and Conditions