Max 16k images for Image Classifier training?
I'm hitting a limit when trying to train an Image Classifier. It fails at about 16k images (in line with the error info) with this error:

IOSurface creation failed: e00002be parentID: 00000000 properties: {
    IOSurfaceAllocSize = 529984;
    IOSurfaceBytesPerElement = 4;
    IOSurfaceBytesPerRow = 1472;
    IOSurfaceElementHeight = 1;
    IOSurfaceElementWidth = 1;
    IOSurfaceHeight = 360;
    IOSurfaceName = CoreVideo;
    IOSurfaceOffset = 0;
    IOSurfacePixelFormat = 1111970369;
    IOSurfacePlaneComponentBitDepths = ( 8, 8, 8, 8 );
    IOSurfacePlaneComponentNames = ( 4, 3, 2, 1 );
    IOSurfacePlaneComponentRanges = ( 1, 1, 1, 1 );
    IOSurfacePurgeWhenNotInUse = 1;
    IOSurfaceSubsampling = 1;
    IOSurfaceWidth = 360;
} (likely per client IOSurface limit of 16384 reached)

I feel like I was able to use more images than this before upgrading to Sonoma, but I don't have the receipts... Is there a way around this? I have oodles of spare memory on my machine; it's using about 16 GB of 64 GB when it crashes.

The code to create the model is:

let parameters = MLImageClassifier.ModelParameters(
    validation: .dataSource(validationDataSource),
    maxIterations: 25,
    augmentation: [],
    algorithm: .transferLearning(
        featureExtractor: .scenePrint(revision: 2),
        classifier: .logisticRegressor
    )
)
let model = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir.url),
    parameters: parameters
)

I have also tried the same training source in the Create ML app: it runs through 'extracting features' and crashes at about 16k images processed. Thank you.
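For reference, here is the training setup as a minimal, self-contained sketch (the directory and output paths are placeholders, not the real ones):

import CreateML
import Foundation

// Placeholder locations: each training/validation folder contains
// one subdirectory per label, filled with images.
let trainingURL = URL(fileURLWithPath: "/path/to/training")
let validationURL = URL(fileURLWithPath: "/path/to/validation")

let parameters = MLImageClassifier.ModelParameters(
    validation: .dataSource(.labeledDirectories(at: validationURL)),
    maxIterations: 25,
    augmentation: [],
    algorithm: .transferLearning(
        featureExtractor: .scenePrint(revision: 2),
        classifier: .logisticRegressor
    )
)

// Feature extraction is the phase where the IOSurface error above
// appears, at roughly 16k images processed.
let model = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingURL),
    parameters: parameters
)

// Write the trained model out so it can be compiled and used with Core ML.
try model.write(to: URL(fileURLWithPath: "/path/to/Classifier.mlmodel"))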
Replies: 1 · Boosts: 0 · Views: 524 · Activity: Oct ’24
New Vision API - Core ML - "The VNDetectorProcessOption_ScenePrints required option was not found"
I'm trying to run a Core ML model. It's an image classifier generated using:

let parameters = MLImageClassifier.ModelParameters(
    validation: .dataSource(validationDataSource),
    maxIterations: 25,
    augmentation: [],
    algorithm: .transferLearning(
        featureExtractor: .scenePrint(revision: 2),
        classifier: .logisticRegressor
    )
)
let model = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir.url),
    parameters: parameters
)

I'm trying to run it with the new async Vision API:

let model = try MLModel(contentsOf: modelUrl)
guard let modelContainer = try? CoreMLModelContainer(model: model) else {
    fatalError("The model is missing")
}
let request = CoreMLRequest(model: modelContainer)

let image = NSImage(named: "testImage")!
let cgImage = image.toCGImage()!
let handler = ImageRequestHandler(cgImage)
do {
    let results = try await handler.perform(request)
    print(results)
} catch {
    print("Failed: \(error)")
}

This gives me:

Failed: internalError("Error Domain=com.apple.Vision Code=7 "The VNDetectorProcessOption_ScenePrints required option was not found" UserInfo={NSLocalizedDescription=The VNDetectorProcessOption_ScenePrints required option was not found}")

Please help! Am I missing something?
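For reference, here is the same request path as a self-contained sketch. It assumes the new (macOS 15) Vision API and a compiled .mlmodelc; the model path and image name are placeholders:

import AppKit
import CoreML
import Vision

func classify() async {
    do {
        // The URL must point at a compiled model (.mlmodelc); if you only
        // have a .mlmodel, compile it first with MLModel.compileModel(at:).
        let modelUrl = URL(fileURLWithPath: "/path/to/Classifier.mlmodelc")
        let model = try MLModel(contentsOf: modelUrl)
        let container = try CoreMLModelContainer(model: model)
        let request = CoreMLRequest(model: container)

        // Convert the NSImage with AppKit instead of a custom toCGImage() helper.
        guard let nsImage = NSImage(named: "testImage"),
              let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
            print("Couldn't load the test image")
            return
        }

        let handler = ImageRequestHandler(cgImage)
        let results = try await handler.perform(request)
        print(results)
    } catch {
        print("Failed: \(error)")
    }
}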
Replies: 2 · Boosts: 0 · Views: 564 · Activity: Oct ’24
Is there a way to stop Xcode showing historical warnings/errors
Xcode has this 'special' habit of showing old errors on builds, sometimes from long ago. Presumably there is a cache somewhere that isn't getting invalidated. Clearing the build folder doesn't fix this. Has anyone found a way that does?
Replies: 3 · Boosts: 6 · Views: 821 · Activity: Jun ’24
Can you publish an unlisted app for tvOS? How to install it?
I have an existing unlisted app on macOS, and I'd like to launch a tvOS version. However, I can't figure out how users would actually install the app. Unlike Mac/iOS, there isn't a browser on tvOS for them to open the install link...
Replies: 2 · Boosts: 0 · Views: 783 · Activity: Dec ’23
Can you specify a port for NWListener.Service without creating an NWListener on that port?
I'm running a web server for a specific service on port 8900. I'm using Telegraph to run the web server, so that opens and claims the port. I also want to advertise the service over Bonjour, ideally with the correct port. This is trivial with NetService, but that's deprecated, so I should probably move to the Network framework.

I can advertise without specifying a port:

listener = try NWListener(service: service, using: .tcp)

but then my service broadcasts addresses with port 61443.

I can advertise using:

listener = try NWListener(using: .tcp, on: <myport>)

however, that fails in my use case because (unsurprisingly) the listener isn't able to get the port (my server already has it).

Is this just a gap in the new API, or am I missing something?
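Condensed, the two attempts look roughly like this (the service name and type here are placeholders):

import Network

func advertise() throws {
    let service = NWListener.Service(name: "MyServer", type: "_myservice._tcp")

    // Attempt 1: advertise without claiming a port. Bonjour registration works,
    // but the advertised endpoint carries the listener's own port, not 8900.
    let advertiser = try NWListener(service: service, using: .tcp)
    advertiser.stateUpdateHandler = { state in print("Advertiser state: \(state)") }
    advertiser.start(queue: .main)

    // Attempt 2: ask for port 8900 explicitly. This fails at start time here,
    // because the Telegraph web server has already bound that port.
    let listener = try NWListener(using: .tcp, on: 8900)
    listener.service = service
    listener.start(queue: .main)
}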
Replies: 1 · Boosts: 0 · Views: 810 · Activity: Nov ’23
How to disable position animation in SwiftUI when button style is animated
I have a simple button style which animates the size of the button when it is pressed. Unfortunately, this causes the position of the button to animate as well (which I don't want). Is there a way to limit the animation only to the scale? I have tried surrounding the offending animation with .animation(nil) and .animation(.none) modifiers, but that didn't work.

import SwiftUI

struct ExampleBackgroundStyle: ButtonStyle {
    func makeBody(configuration: Self.Configuration) -> some View {
        configuration.label
            .padding()
            .foregroundColor(.white)
            .background(Color.red)
            .cornerRadius(40)
            .padding(.horizontal, 20)
            .animation(nil)
            .scaleEffect(configuration.isPressed ? 0.6 : 1.0)
            .animation(.easeIn(duration: 0.3))
            .animation(nil)
    }
}

When I show a button in a sheet, it animates in from the top left (0,0) to its position in the centre of the sheet after the sheet appears. If I remove the animation on the scaleEffect, this doesn't happen.
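For comparison, here is a sketch of the same style using the value-scoped animation modifier, so that only changes of isPressed animate. This is an untested variant, not a confirmed fix for the sheet behaviour described above:

import SwiftUI

// Variant of the style above: the animation is tied to isPressed,
// so unrelated layout changes (like the sheet positioning the button)
// should not pick it up. Untested against the sheet scenario in the post.
struct ScaleOnlyButtonStyle: ButtonStyle {
    func makeBody(configuration: Self.Configuration) -> some View {
        configuration.label
            .padding()
            .foregroundColor(.white)
            .background(Color.red)
            .cornerRadius(40)
            .padding(.horizontal, 20)
            .scaleEffect(configuration.isPressed ? 0.6 : 1.0)
            .animation(.easeIn(duration: 0.3), value: configuration.isPressed)
    }
}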
Replies: 3 · Boosts: 0 · Views: 4.2k · Activity: Mar ’22
Why have 2011 videos been delisted?
The 2020 session 'Author fragmented MPEG-4 content with AVAssetWriter' refers to the 2011 session 'Working with media in AVFoundation' (and when I say 'refers', I mean it says 'I'm not going to cover this important part - just look at the 2011 video'). However, that session seems to have been deleted. Frustrating... #wwdc20-10011
Replies: 0 · Boosts: 0 · Views: 610 · Activity: Oct ’21