Swift – Simple full screen loader

Hello,

Resuming an old post that helps you create a UIAlert extension to show a modal popup, https://www.albertopasca.it/whiletrue/objective-c-modal-view-in-navigation-and-tabbar-controller-projects/ , I’ve created a new, more modern implementation that uses UIWindow.

The goal is to show a spinner (or a Lottie spinner, or whatever you prefer), centered on the screen, with automatic or manual dismissal. The background can be blurred or colored, as in the screenshot in the full post.
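A minimal sketch of the idea (not the post’s exact code, and the names are illustrative): a dedicated UIWindow placed above everything else, hosting a centered UIActivityIndicatorView. It assumes an iOS 13+ scene-based UIKit app.

```swift
import UIKit

// Minimal sketch of a UIWindow-based full screen loader (illustrative names).
final class FullScreenLoader {

    private static var window: UIWindow?

    /// Shows the loader on top of the given scene.
    static func show(in scene: UIWindowScene) {
        guard window == nil else { return }

        // Host view controller with a dimmed background and a centered spinner.
        let host = UIViewController()
        host.view.backgroundColor = UIColor.black.withAlphaComponent(0.4)

        let spinner = UIActivityIndicatorView(style: .large)
        spinner.translatesAutoresizingMaskIntoConstraints = false
        spinner.startAnimating()
        host.view.addSubview(spinner)
        NSLayoutConstraint.activate([
            spinner.centerXAnchor.constraint(equalTo: host.view.centerXAnchor),
            spinner.centerYAnchor.constraint(equalTo: host.view.centerYAnchor)
        ])

        // A separate window floating above the app's normal window.
        let loaderWindow = UIWindow(windowScene: scene)
        loaderWindow.windowLevel = .alert
        loaderWindow.rootViewController = host
        loaderWindow.isHidden = false
        window = loaderWindow
    }

    /// Hides the loader (manual dismissal).
    static func hide() {
        window?.isHidden = true
        window = nil
    }
}
```

Blur is just a matter of swapping the semi-transparent background color for a UIVisualEffectView, and automatic dismissal is simply `hide()` called after a delay (e.g. with `DispatchQueue.main.asyncAfter`).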

Continue reading “Swift – Simple full screen loader”

Debug (and control) your Robotic Arm with iOS

This is my work-in-progress ~50 cm robotic arm with 6-axis servo motors, an Arduino, and a Raspberry Pi (for image recognition in phase 2):

Arduino Robotic Arm 6AXIS iOS controlled

While writing the C++ code for the Arduino, I found I needed a way to quickly and easily send commands to it from my iPhone over Bluetooth.

Basically, I need to control my robotic arm from an external device instead of uploading and re-running the code on the Arduino board over and over.

For this reason, I’ve attached a BLE board to the Arduino and created a simple app (completely written in SwiftUI 😍😎) that uses a BLE connection to talk to the Arduino’s BLE board and send string commands, which are parsed and executed.
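The app’s code isn’t reproduced in this excerpt, but the core of the sending side with Core Bluetooth looks roughly like this; service/characteristic discovery and delegate boilerplate are omitted, and the names are illustrative:

```swift
import CoreBluetooth

// Minimal sketch: once the peripheral is connected and the writable characteristic
// of the BLE board has been discovered, a command is just a UTF-8 string written
// to that characteristic.
func send(_ command: String,
          to peripheral: CBPeripheral,
          characteristic: CBCharacteristic) {
    guard let data = command.data(using: .utf8) else { return }
    // .withoutResponse keeps latency low; use .withResponse if the firmware acknowledges writes.
    peripheral.writeValue(data, for: characteristic, type: .withoutResponse)
}
```

On the Arduino side, the received string is parsed and mapped to servo movements; the exact command format is described in the full post.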

The commands sent look like this:

Continue reading “Debug (and control) your Robotic Arm with iOS”

iOS – Data Leakage: App background cache

Today I want to share a simple way to retrieve user data from a native iOS cache image that is automatically generated when your app goes into the background.

You can retrieve this kind of information if your phone is lost or jailbroken, or by connecting it to your PC and opening an old backup. You can use tools like iBackup Viewer and more…
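Before diving in, a quick aside: since the snapshot is taken automatically when the app resigns active, the usual way to keep sensitive content out of it is to cover the window at that moment. A minimal sketch of that mitigation, assuming a scene-based UIKit app (the names are mine, not from the post):

```swift
import UIKit

// Sketch of the common mitigation: cover the window with an opaque view
// before iOS captures the background snapshot, and remove it on return.
final class SnapshotCover {
    private static let coverTag = 0xC0FFEE

    static func add(to window: UIWindow?) {
        guard let window = window, window.viewWithTag(coverTag) == nil else { return }
        let cover = UIView(frame: window.bounds)
        cover.tag = coverTag
        cover.backgroundColor = .systemBackground
        window.addSubview(cover)
    }

    static func remove(from window: UIWindow?) {
        window?.viewWithTag(coverTag)?.removeFromSuperview()
    }
}

// Call SnapshotCover.add(to:) in sceneWillResignActive(_:) and
// SnapshotCover.remove(from:) in sceneDidBecomeActive(_:).
```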

Let’s see how it works!

Continue reading “iOS – Data Leakage: App background cache”

Swift – Native OCR reader

Starting from iOS 11, Apple introduced a new framework called Vision.

The Vision framework performs face and face landmark detection, text detection, barcode recognition, image registration, and general feature tracking. Vision also allows the use of custom Core ML models for tasks like classification or object detection.

https://developer.apple.com/documentation/vision

Today we implement, in a few lines of code, one of the simplest features of this beautiful framework: the OCR reader.
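As a preview, a minimal text-recognition sketch with Vision can look like the following; it uses VNRecognizeTextRequest, which requires iOS 13+, and the post’s own implementation may differ:

```swift
import UIKit
import Vision

// Minimal sketch: run Vision's text recognition on a UIImage and print the result.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the best candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate // .fast trades accuracy for speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            print("OCR failed: \(error)")
        }
    }
}
```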

Continue reading “Swift – Native OCR reader”