Aloha Quick Lookers,
I'm using usdzconvert preview 0.64 to create *.usdz files, which I then edit in ASCII *.usda format and repackage as *.usdz.
This way I was able to fix the scale (the original glTF files usually use metersPerUnit = 1, while usdzconvert 0.64 always sets the result to metersPerUnit = 0.01).
Now I'm trying to follow the docs
to anchor my Glasses prim to the user's face, but whatever I try, it only ever gets placed on my table's surface.
If I import my *.usdz file into Reality Composer and export it to usdz, it does get anchored to the face correctly.
When I open the Reality Composer export, it really doesn't look that different from my manually edited usdz (it just wraps the geometry in another layer, I assume because of the import/export round trip through Reality Composer).
What am I doing wrong?
Here's the Reality Composer generated usda:
```
#usda 1.0
(
    autoPlay = false
    customLayerData = {
        string creator = "com.apple.RCFoundation Version 1.5 (171.5)"
        string identifier = "9AAF5C5D-68AB-4034-8037-9BBE6848D8E5"
    }
    defaultPrim = "Root"
    metersPerUnit = 1
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "Root"
{
    def Scope "Scenes" (
        kind = "sceneLibrary"
    )
    {
        def Xform "Scene" (
            customData = {
                bool preliminary_collidesWithEnvironment = 0
                string sceneName = "Scene"
            }
            sceneName = "Scene"
        )
        {
            token preliminary:anchoring:type = "face"
            quatf xformOp:orient = (0.70710677, 0.70710677, 0, 0)
            double3 xformOp:scale = (1, 1, 1)
            double3 xformOp:translate = (0, 0, 0)
            uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:orient", "xformOp:scale"]
            // ...
```
And here is my own, minimal usdz with the same face-anchoring token:
```
#usda 1.0
(
    autoPlay = false
    customLayerData = {
        string creator = "usdzconvert preview 0.64"
    }
    defaultPrim = "lupetto_local"
    metersPerUnit = 1
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "lupetto_local" (
    assetInfo = {
        string name = "lupetto_local"
    }
    kind = "component"
)
{
    def Scope "Geom"
    {
        def Xform "Glasses"
        {
            token preliminary:anchoring:type = "face"
            double3 xformOp:translate = (0, 0.01799999736249447, 0.04600000008940697)
            uniform token[] xformOpOrder = ["xformOp:translate"]
            // ...
```
I'd attach the full files, but they are 2 MB in size and the maximum upload size is 200 KB. I can create a minimal example file with less geometry in case the code above is not enough.
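For context, I'm testing the files in AR Quick Look via a plain QLPreviewController, roughly like this (a simplified sketch; "glasses" is a placeholder for my actual file name):

```swift
import UIKit
import QuickLook
import ARKit

// Minimal AR Quick Look host used for testing the exported usdz.
class PreviewHost: UIViewController, QLPreviewControllerDataSource {
    func showPreview() {
        let controller = QLPreviewController()
        controller.dataSource = self
        present(controller, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "glasses.usdz" stands in for my actual asset.
        let url = Bundle.main.url(forResource: "glasses", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```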
I tried animating the scrollTo() like so, as described in the docs (https://developer.apple.com/documentation/swiftui/scrollviewreader):
```swift
withAnimation {
    scrollProxy.scrollTo(index, anchor: .center)
}
```
The result is the same as if I do:
```swift
withAnimation(Animation.easeIn(duration: 20)) {
    scrollProxy.scrollTo(progress.currentIndex, anchor: .center)
}
```
I tried this using the example from the ScrollViewReader docs, with the result that scrolling up and down uses exactly the same animation:
```swift
struct ScrollingView: View {
    @Namespace var topID
    @Namespace var bottomID

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                Button("Scroll to Bottom") {
                    withAnimation {
                        proxy.scrollTo(bottomID)
                    }
                }
                .id(topID)

                VStack(spacing: 0) {
                    ForEach(0..<100) { i in
                        color(fraction: Double(i) / 100)
                            .frame(height: 32)
                    }
                }

                Button("Top") {
                    withAnimation(Animation.linear(duration: 20)) {
                        proxy.scrollTo(topID)
                    }
                }
                .id(bottomID)
            }
        }
    }

    func color(fraction: Double) -> Color {
        Color(red: fraction, green: 1 - fraction, blue: 0.5)
    }
}

struct ScrollingView_Previews: PreviewProvider {
    static var previews: some View {
        ScrollingView()
    }
}
```
In our app, we're streaming short HLS streams to local AVPlayers.
In the view we have an AVPlayerLayer which is connected to the AVPlayer we start, and usually we hear audio and see the video.
Sometimes, instead of video, we'll see a grey screen, while still being able to hear the audio.
The player layer shows a shade of grey that doesn't come from anywhere in our app. Also, if for testing purposes we don't connect the player to the player layer, the grey color isn't there, so it's definitely coming from the AVPlayerLayer.
If this is somehow related to the HLS stream becoming corrupted, what are the APIs in AVFoundation we could use to debug this?
So far, when this happens, we see no errors being caught or thrown in our debug logs.
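The closest I've found so far is AVPlayerItem's error and access logs; is that the right direction? A minimal sketch of how I'd wire them up (the print statements stand in for our real logging, and the observer tokens would be retained in the real code):

```swift
import AVFoundation

// Sketch: surface HLS error-log entries and playback failures in our logs.
func attachDiagnostics(to item: AVPlayerItem) {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry, object: item, queue: .main
    ) { _ in
        if let event = item.errorLog()?.events.last {
            print("HLS error: \(event.errorDomain) \(event.errorStatusCode): \(event.errorComment ?? "-")")
        }
    }
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime, object: item, queue: .main
    ) { note in
        let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
        print("Failed to play to end: \(String(describing: error))")
    }
}
```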
In an app with multiple AVPlayers (think a TikTok-style UX, for example), where:
• one video is currently streaming
• other videos, which users will move to next, have already been loaded into AVPlayers for pre-buffering
is there an API to give the currently playing stream higher priority than the pre-buffering ones that are not on screen?
It's nice to have instant play while scrolling, but we also don't want to starve the currently playing player of bandwidth.
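The closest thing I've come across is AVPlayerItem.preferredPeakBitRate: capping the off-screen items and lifting the cap for the visible one, roughly like this (the 500 kbit/s cap is an arbitrary value I picked):

```swift
import AVFoundation

// Sketch: throttle pre-buffering players so the visible one gets the bandwidth.
// A preferredPeakBitRate of 0 means "no limit".
func updateBandwidthPriorities(visible: AVPlayer, prebuffering: [AVPlayer]) {
    visible.currentItem?.preferredPeakBitRate = 0
    for player in prebuffering {
        player.currentItem?.preferredPeakBitRate = 500_000
    }
}
```

But that caps quality rather than expressing priority, which is why I'm asking whether there's a dedicated API.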
So, I've been following along with the video "Distribute binary frameworks as Swift packages" (https://developer.apple.com/videos/play/wwdc2020/10147/), but now I'm stuck.
I have a Swift package that works fine when used as a source package, and I now want to ship it as a binary.
The video says I'm supposed to add a binary target:
```swift
import PackageDescription

let package = Package(
    name: "Test",
    defaultLocalization: "de",
    platforms: [
        .iOS(.v13)
    ],
    products: [
        .library(
            name: "Test",
            targets: ["Test"]
        ),
    ],
    dependencies: [
    ],
    targets: [
        // .target(name: "Test")
        .binaryTarget(
            name: "Test",
            url: "https://static.looc.io/Test/Test-1.0.0.xcframework.zip",
            checksum: "9848327892347324789432478923478"
        )
    ]
)
```
But the xcframework is what I'm trying to build; I don't have an xcframework yet.
With the above, if I run:
```bash
[konrad@iMac-2 Source]$ xcodebuild archive -workspace Test -scheme Test \
    -archivePath "tmp/iOS" \
    -destination "generic/platform=iOS" \
    SKIP_INSTALL=NO BUILD_LIBRARY_FOR_DISTRIBUTION=YES
```
I get the error:
```bash
xcodebuild: error: Could not resolve package dependencies:
  artifact of binary target 'Test' failed download: invalid status code 403
```
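Presumably the 403 just means SwiftPM tries to download an artifact that doesn't exist yet. One workaround I'm considering is pointing the binary target at a local file while bootstrapping (path-based binary targets exist since tools version 5.3; the path below is a placeholder):

```swift
// swift-tools-version:5.3
// Sketch: a local, path-based binary target for bootstrapping;
// "./artifacts/Test.xcframework" is a placeholder path.
import PackageDescription

let package = Package(
    name: "Test",
    platforms: [.iOS(.v13)],
    products: [
        .library(name: "Test", targets: ["Test"])
    ],
    targets: [
        .binaryTarget(
            name: "Test",
            path: "./artifacts/Test.xcframework"
        )
    ]
)
```

Is that the intended workflow, or should the xcframework be built from a separate Xcode project?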
Is anyone aware of any work on, or has anyone ever written, an anisotropic shader using GLSL for SceneKit?
It's commonplace in renderers like Blender or Maya (I can't link to the docs as the dev forums don't let me post URLs, but it's easy to google).
Unity has anisotropic shader packages too, so it doesn't seem to be impossible to achieve in a game engine.
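Something along the lines of this (untested) sketch is what I'm after: a Kajiya-Kay-style highlight injected via a lighting-model shader modifier. The exponent and the use of the mesh tangent are guesses on my part, and the modifier body is Metal rather than GLSL:

```swift
import SceneKit

// Sketch: stretch the specular highlight along the surface tangent
// (Kajiya-Kay style), added per light in the lighting stage.
func applyAnisotropicHighlight(to material: SCNMaterial) {
    let modifier = """
    #pragma body
    float3 t = normalize(_surface.tangent);
    float3 h = normalize(_light.direction + _surface.view);
    float th = dot(t, h);
    float spec = pow(sqrt(max(0.0, 1.0 - th * th)), 80.0);
    _lightingContribution.specular += spec * _light.intensity.rgb;
    """
    material.shaderModifiers = [.lightingModel: modifier]
}
```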