Monday at ISTE began with me frantically trying to find my first session in the San Antonio Convention Center (not an easy place to navigate – especially for those of us who are spatially challenged), only to discover that I needed a ticket to enter. Fortunately, it was the one Apple morning session that wasn’t full, so I boomeranged between the usher at the entrance and the ticket stand with admirable speed and found myself one of the last people to be welcomed into a hands-on session centered on Apple’s Swift Playgrounds app.
You may recall that I wrote about the iOS Swift Playgrounds app when it first entered the scene, and I wasn’t all that impressed. It has had four or five updates since then, and some areas have definitely improved.
I still stand by my original assertion that students need to be pretty adept readers to take advantage of the app, and I wouldn’t use it with students below a 4th-grade reading level. However, the new “Accessories” tab, which allows the app to control multiple hardware devices, may be a game-changer. For example, my students could now control Lego EV3, Dash robots, and Sphero, among other robots, using Swift Playgrounds. The advantage of this over other apps, such as Tickle, is that students will be moving from introductory block programming to more widely used text-based programming. There are plenty of tutorials within the app to ease this transition.
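To give a sense of the jump from blocks to text, here is the flavor of code students write in the app’s “Learn to Code” puzzles. Commands like moveForward() and collectGem() do appear in those lessons, but the stub implementations below are my own stand-ins for illustration; in the app, they move a character through a 3D puzzle world.

```swift
// Illustrative stand-ins for the movement commands used in
// Swift Playgrounds' "Learn to Code" puzzles. In the app itself,
// these functions are provided and animate an on-screen character.
var position = 0
var gemsCollected = 0

func moveForward() { position += 1 }
func collectGem() { gemsCollected += 1 }

// A typical early puzzle: walk three squares, collecting a gem on each.
for _ in 1...3 {
    moveForward()
    collectGem()
}

print(position, gemsCollected)  // 3 3
```

Even this tiny loop shows what students gain over block programming: real Swift syntax (functions, a `for` loop, ranges) that transfers directly to writing full apps later.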
Another feature that I like about Swift Playgrounds is its recording function: students can work through a tutorial and submit recordings of their solutions to the teacher as a reflection. You can also take screenshots within the app and export your code to PDF. There are hints within the tutorials, but later levels require that you put a little effort into solving the coding puzzles before you can receive any help. The app is definitely worth looking into if you are an educator working with students who already have some programming experience and are looking for the next step. Curriculum resources are available here.
My second session also happened to be sponsored by Apple (no ticket required for this one). In this session, we learned about Apple Clips, a video editing app that may eventually replace iMovie. This app is optimized for mobile use as well as social media, and it is clear a lot of thought was put into its development. Just like iMovie, Clips allows you to take video, edit it, and add music. But Clips has taken a lot of the manual labor out of video creation. Music is automatically edited with intros and outros to fit your clips. Cropping and “Ken Burns-ing” easily become seamless portions of your video, and you can add layers, effects, and titles with taps of the finger. One of my favorite features is “live titles.” This allows you to create closed captions for your video, adding text to the video as you record in real time. The text is aligned to the actual timing of your speech, so if you pause, so does the text. You can also easily edit the text if your words aren’t interpreted the way you intended.
Clips looks great. Designed for this generation of “on-the-fly” videographers, it could be the ideal tool. However, I have heard from a few people and read in some reviews that it can be glitchy. I have not experienced any issues myself, but I was disappointed when my somewhat older classroom iPad was deemed too ancient to be “compatible with this app.” Like many new products, Clips may need to age a bit (but maybe not as much as my unfortunate iPad) before it takes off, but I’m ready to give it a try.
For some examples of ways that Clips has been used in schools, check out #classroomclips on Twitter.