swift – How can I visualize audio data amplitude graphically using UIGraphics in iOS


I want to display an interactive audio waveform like this.

I’ve extracted the sample data using AVAssetReader. Using this data, I’m drawing a UIBezierPath in a ScrollView’s contentView. Currently, when I pinch to zoom in or out on the scrollView, I downsample the sample data to determine how many samples should be shown.
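For context, extracting per-pixel peak amplitudes with AVAssetReader can look roughly like the sketch below. This is not the post's original code; `loadAmplitudes` and `samplesPerPixel` are illustrative names, the audio is assumed to decode to 16-bit interleaved PCM, and error handling is minimal:

```swift
import AVFoundation

// Reads the first audio track of an asset and returns normalized peak
// amplitudes (0...1), one value per `samplesPerPixel` PCM frames.
func loadAmplitudes(from url: URL, samplesPerPixel: Int = 1024) throws -> [CGFloat] {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .audio).first else { return [] }

    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    var amplitudes: [CGFloat] = []
    var window: [Int16] = []

    while let sampleBuffer = output.copyNextSampleBuffer(),
          let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
        // Copy the raw PCM bytes out of the block buffer.
        let length = CMBlockBufferGetDataLength(blockBuffer)
        var data = Data(count: length)
        data.withUnsafeMutableBytes { ptr in
            _ = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0,
                                           dataLength: length,
                                           destination: ptr.baseAddress!)
        }
        data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            window.append(contentsOf: raw.bindMemory(to: Int16.self))
        }
        // Emit one peak value per window of `samplesPerPixel` samples.
        while window.count >= samplesPerPixel {
            let chunk = window.prefix(samplesPerPixel)
            let peak = chunk.map { abs(Int($0)) }.max() ?? 0
            amplitudes.append(CGFloat(peak) / CGFloat(Int16.max))
            window.removeFirst(samplesPerPixel)
        }
    }
    return amplitudes
}
```

The resulting array can be assigned directly to the view's `amplitudes` property below.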

class WaveformView: UIView {
    var amplitudes: [CGFloat] = [] {
        didSet {
            setNeedsDisplay()
        }
    }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext(), !amplitudes.isEmpty else { return }

        // Set up drawing parameters
        context.setStrokeColor(UIColor.black.cgColor)
        context.setLineWidth(1.0)
        context.setLineCap(.round)

        let midY = rect.height / 2
        let widthPerSample = rect.width / CGFloat(amplitudes.count)

        // Draw waveform
        let path = UIBezierPath()

        for (index, amplitude) in amplitudes.enumerated() {
            let x = CGFloat(index) * widthPerSample
            let height = amplitude * rect.height * 0.8

            // Draw a vertical line for each sample
            path.move(to: CGPoint(x: x, y: midY - height))
            path.addLine(to: CGPoint(x: x, y: midY + height))
        }
        }

        path.stroke()
    }
}

Added a gesture handler:

@objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        switch gesture.state {
        case .began:
            initialPinchDistance = gesture.scale
            
        case .changed:
            let scaleFactor = gesture.scale / initialPinchDistance
            var newScale = currentScale * scaleFactor
            newScale = min(max(newScale, minScale), maxScale)
            
            // Update displayed samples with the new scale
            updateDisplayedSamples(scale: newScale)
            print(newScale)
            // Maintain the zoom center point
            let pinchCenter = gesture.location(in: scrollView)
            let offsetX = (pinchCenter.x - scrollView.bounds.origin.x) / scrollView.bounds.width
            let newOffsetX = (totalWidth * offsetX) - (pinchCenter.x - scrollView.bounds.origin.x)
            scrollView.contentOffset.x = max(0, min(newOffsetX, totalWidth - scrollView.bounds.width))
            
            view.layoutIfNeeded()
            
        case .ended, .cancelled:
            currentScale = scrollView.contentSize.width / (baseWidth * widthPerSample)
            
        default:
            break
        }
    }
private func updateDisplayedSamples(scale: CGFloat) {
        let targetSampleCount = Int(baseWidth * scale)
        displayedSamples = downsampleWaveform(samples: rawSamples, targetCount: targetSampleCount)
        waveformView.amplitudes = displayedSamples
        
        totalWidth = CGFloat(displayedSamples.count) * widthPerSample
        contentWidthConstraint?.constant = totalWidth
        scrollView.contentSize = CGSize(width: totalWidth, height: 300)
    }
private func downsampleWaveform(samples: [CGFloat], targetCount: Int) -> [CGFloat] {
        guard samples.count > 0, targetCount > 0 else { return [] }
        
        if samples.count <= targetCount {
            return samples
        }
        
        var downsampled: [CGFloat] = []
        let sampleSize = samples.count / targetCount
        
        for i in 0..<targetCount {
            let startIndex = i * sampleSize
            let endIndex = min(startIndex + sampleSize, samples.count)
            let slice = samples[startIndex..<endIndex]
            
            // For each window, take the maximum value to preserve peaks
            if let maxValue = slice.max() {
                downsampled.append(maxValue)
            }
        }
        
        return downsampled
    }

This approach is very inefficient: every time gesture.state changes, I recompute the downsampled data and perform UI updates based on it. How can I implement this functionality more efficiently for smooth interaction?
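One common optimization (a sketch of a general technique, not code from the question) is to precompute a pyramid of downsampled arrays once after loading, then during the pinch simply pick the nearest precomputed level instead of re-downsampling the raw samples on every gesture update. `WaveformLevelCache` and its member names are illustrative:

```swift
import Foundation
import CoreGraphics

// Precomputes peak-downsampled levels at doubling resolutions so the pinch
// handler only does a dictionary lookup per frame, not a full downsample pass.
final class WaveformLevelCache {
    private var levels: [Int: [CGFloat]] = [:]  // targetCount -> samples

    init(rawSamples: [CGFloat], baseCount: Int, octaves: Int = 8) {
        var count = baseCount
        for _ in 0..<octaves {
            levels[count] = Self.downsample(rawSamples, targetCount: count)
            count *= 2
        }
    }

    // Returns the precomputed level whose sample count is closest to `target`.
    func samples(forTargetCount target: Int) -> [CGFloat] {
        guard let best = levels.keys.min(by: { abs($0 - target) < abs($1 - target) })
        else { return [] }
        return levels[best] ?? []
    }

    // Same max-per-window strategy as the question's downsampleWaveform.
    private static func downsample(_ samples: [CGFloat], targetCount: Int) -> [CGFloat] {
        guard targetCount > 0, samples.count > targetCount else { return samples }
        let window = samples.count / targetCount
        return (0..<targetCount).map { i in
            let start = i * window
            return samples[start..<min(start + window, samples.count)].max() ?? 0
        }
    }
}
```

During `.changed` the handler would call `cache.samples(forTargetCount:)` and only adjust the view's transform or content offset; the exact-resolution downsample (and any AVAssetReader work) can run once on a background queue, with a final precise pass on `.ended`.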

roosho Senior Engineer (Technical Services)
I am Rakib Raihan RooSho, Jack of all IT Trades. You got it right. Good for nothing. I try a lot of things and fail more than that. That's how I learn. Whenever I succeed, I note that in my cookbook. Eventually, that became my blog. 
https://www.roosho.com