Overview

Tenglong Company game member registration account and login URL TL 9655.com

Kokang Tenglong Enterprise Co., Ltd. (WeChat: 184-7933-278) was founded on August 12, 2006. Here is an introduction to the company's main location and the industries it operates in. Tenglong Company is located in a small city on the border of Lincang, Yunnan Province, called Kokang Old Street, near the Twin Towers in the Kokang city center. The company mainly operates in tourism, hotel services, construction, technology and gaming, catering, and so on. Although this well-known small city is not very large, it has a reputation as a glitzy "little Macau". That is all for today's introduction; if you feel like traveling, you are welcome to visit this small city and experience its local ethnic customs.
Replies: 0 · Boosts: 0 · Views: 285 · Activity: 4w
Can SwiftUI Views serve as delegates?
Structs are value types, and a SwiftUI View gets reinitialized many times throughout its lifecycle. Whenever it gets reinitialized, would the reference that the delegator holds to it still work, given that the View uses @State or @StateObject to hold a persistent reference to the view's data?

```swift
import SwiftUI

protocol MyDelegate: AnyObject {
    func didDoSomething()
}

class Delegator {
    weak var delegate: MyDelegate?
    func trigger() {
        delegate?.didDoSomething()
    }
}

struct ContentView: View, MyDelegate {
    private let delegator = Delegator()
    @State private var counter = 1

    var body: some View {
        VStack {
            Text("\(counter)")
            Button("Trigger") {
                delegator.trigger()
            }
        }
    }

    func didDoSomething() {
        counter += 1 // would this call update the counter in the view even if the view's instance is copied over to the delegator?
    }
}
```
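For illustration, a minimal sketch of one common alternative arrangement, reusing MyDelegate and Delegator from the snippet above: a reference-type model owned via @StateObject conforms to the delegate protocol, so the delegator's weak reference points at one stable object while the View struct is recreated freely. The ViewModel and CounterView names are assumptions, not part of the original code:

```swift
import SwiftUI

// The class satisfies the AnyObject-constrained protocol, which a struct cannot,
// and @Published drives the SwiftUI update when the delegate callback fires.
final class ViewModel: ObservableObject, MyDelegate {
    @Published var counter = 1
    let delegator = Delegator()

    init() {
        delegator.delegate = self
    }

    func didDoSomething() {
        counter += 1
    }
}

struct CounterView: View {
    @StateObject private var model = ViewModel()

    var body: some View {
        VStack {
            Text("\(model.counter)")
            Button("Trigger") { model.delegator.trigger() }
        }
    }
}
```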
Replies: 0 · Boosts: 0 · Views: 43 · Activity: 2w
We are getting a blank image after capturing and compressing the picture.
We used the method below to resize the image while compressing it. Is this method correct, or does it need a correction, either in the method itself or in the CGBitmapContextCreate call?

```objc
- (UIImage *)resizeImage:(UIImage *)anImage width:(int)width height:(int)height {
    CGImageRef imageRef = [anImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone) {
        alphaInfo = kCGImageAlphaNoneSkipLast;
    }

    CGContextRef bitmap = CGBitmapContextCreate(NULL, width, height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                4 * width,
                                                CGImageGetColorSpace(imageRef),
                                                alphaInfo);
    CGContextDrawImage(bitmap, CGRectMake(0, 0, width, height), imageRef);

    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];

    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return result;
}
```
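For comparison, a minimal Swift sketch of the same resize done with UIGraphicsImageRenderer, which creates a compatible bitmap context (color space, alpha, and scale) automatically. This is an alternative approach for the same task, not a fix to the method above:

```swift
import UIKit

// Sketch: resize an image without creating a CGBitmapContext by hand.
// The renderer picks a pixel format that is guaranteed to be drawable.
func resized(_ image: UIImage, to size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}
```

If the output from a renderer-based path is not blank, that points at the hand-built context (bits per component, bytes per row, or alpha info) rather than the source image.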
Replies: 0 · Boosts: 0 · Views: 294 · Activity: 3w
Issuer Functional Requirements Apple Pay Specifications Version 3.5
I'm seeking clarification on how Requirement 4.1 ("Card Issuers with a Mobile App must support In-App provisioning") applies when the card issuer uses a third-party mobile banking platform rather than a self-developed app.

Our situation:
- We are a small credit union (the card issuer).
- Our mobile banking app is provided by a third-party digital banking vendor (white-label, but branded with our name).
- Card processing is handled by a separate vendor.

The ambiguity: The Apple Pay Specifications define "Card Issuer Mobile App" as: "The Card Issuer-branded, iOS software application made available on a Device that is used by such Card Issuer's customers to manage, administer, or use Cards." Our mobile banking app meets this definition: it's branded with our name and used by our members to manage their accounts and cards. However, we don't develop or directly control the app; our digital banking vendor does.

The webinar FAQ stated: "Do we have to implement in-app provisioning? Yes, if you have an app." Our digital banking vendor interprets this as not applying to them because they are "not the issuer." They've stated: "Apple's requirements are at the card-processor level... our credit unions and, by extension, we are not required to support Apple Pay's in-app provisioning." Our card processor has indicated they will support in-app provisioning integrations but notes "this would be digital provisioning and we would need the digital banking vendor to work with us to enable."

Specific questions:
1. When a card issuer uses a third-party mobile banking app (branded for the issuer but developed/maintained by a vendor), does Requirement 4.1 apply?
2. If yes, who bears compliance responsibility: the issuer, the mobile app vendor, or both?
3. If the mobile app vendor does not implement in-app provisioning by January 15, 2026, what is the issuer's exposure? Does the issuer face suspension from the Program due to vendor non-compliance?
4. Is there an alternative compliance path under Requirement 4.8 (Web Provisioning) for issuers whose mobile app vendors cannot deliver in-app provisioning by the deadline?

This scenario likely affects hundreds of small financial institutions using shared digital banking platforms. Clarity on vendor vs. issuer responsibility would help the entire ecosystem prepare appropriately. Thank you.
Replies: 0 · Boosts: 1 · Views: 130 · Activity: 2w
Xcode and Anthropic: usage limits
Hi, I have been using Claude with Xcode 26.1.1 and it was working fine until a couple of days ago, when it started giving me an error message on every query: "Message from Anthropic: This request would exceed your account's rate limit. Please try again later." I am on the Pro plan, and I can use the Claude.ai website just fine. The usage limits on the website show I've used only 11% of the weekly limit and 4% of the current session. So I'm wondering if this is an Xcode bug or issue, and if there is a workaround. I have restarted Xcode, restarted the Mac, logged out of Anthropic and logged back in, and tried the Xcode 26.2 beta as well, but no luck.
Replies: 0 · Boosts: 0 · Views: 56 · Activity: 3w
Switching default input/output channels using Core Audio
I wrote a Swift macOS app to control a PCI audio device. The code switches between the default output and input channels. As soon as I launch the Audio MIDI Setup utility, channel switching stops working: the driver properties allow switching, but the system doesn't respond. I have to delete the contents of /Library/Preferences/Audio and reset Core Audio. What am I missing?

```swift
func setDefaultChannelsOutput() {
    guard let deviceID = getDeviceIDByName(deviceName: "PCI-424") else { return }

    let selectedIndex = DefaultChannelsOutput.indexOfSelectedItem
    if selectedIndex < 0 || selectedIndex >= 24 { return }

    let channel1 = UInt32(selectedIndex * 2 + 1)
    let channel2 = UInt32(selectedIndex * 2 + 2)
    var channels: [UInt32] = [channel1, channel2]

    var propertyAddress = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyPreferredChannelsForStereo,
        mScope: kAudioDevicePropertyScopeOutput,
        mElement: kAudioObjectPropertyElementWildcard
    )

    let dataSize = UInt32(MemoryLayout<UInt32>.size * channels.count)
    let status = AudioObjectSetPropertyData(deviceID, &propertyAddress, 0, nil, dataSize, &channels)
    if status != noErr {
        print("Error setting default output channels: \(status)")
    }
}
```
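One way to confirm whether Core Audio actually accepted the change is to read the preferred stereo pair back after setting it. A minimal debugging sketch, assuming the same deviceID as above (this is not part of the original app):

```swift
import CoreAudio

// Reads back kAudioDevicePropertyPreferredChannelsForStereo for the output scope.
// Returns the two preferred channel numbers, or nil if the query fails.
func preferredOutputChannels(for deviceID: AudioObjectID) -> [UInt32]? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyPreferredChannelsForStereo,
        mScope: kAudioDevicePropertyScopeOutput,
        mElement: kAudioObjectPropertyElementMain   // macOS 12 and later
    )
    var channels: [UInt32] = [0, 0]
    var dataSize = UInt32(MemoryLayout<UInt32>.size * channels.count)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &dataSize, &channels)
    return status == noErr ? channels : nil
}
```

Comparing the value before and after launching Audio MIDI Setup would show whether the preference is being overwritten or simply ignored.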
Replies: 0 · Boosts: 0 · Views: 255 · Activity: 3w
Paste button on navigation bar is not shown on iOS/iPadOS 26
The Paste button (a UIPasteControl) located on the UINavigationBar is not shown at application startup on iOS/iPadOS 26. The issue disappears when the device is rotated or the window size is changed, and it does not occur on iOS/iPadOS 18 and earlier. I uploaded a sample project to GitHub at the following URL: https://github.com/gpn-galapagos/PasteButtonApp.git Has anyone had the same issue or knows of a workaround?
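For reference, a minimal sketch of the kind of setup being described: a UIPasteControl placed on the navigation bar via a bar button item. The view controller and the text-field target are assumptions for illustration, not the sample project's actual code:

```swift
import UIKit

final class ExampleViewController: UIViewController {
    private let textField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        // UIPasteControl (iOS 16+) pastes into any paste-capable UIResponder.
        let pasteControl = UIPasteControl(configuration: UIPasteControl.Configuration())
        pasteControl.target = textField
        // Wrap the control in a bar button item so it sits on the navigation bar.
        navigationItem.rightBarButtonItem = UIBarButtonItem(customView: pasteControl)
    }
}
```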
Replies: 0 · Boosts: 0 · Views: 172 · Activity: 3w
Cybersource payment gateway not able to decrypt payment token
Cybersource production support has clarified the issue as follows: "On the BAD Case, it seems that the Apple Payload did not contain the "onlinePaymentCryptogram" object within the JSON. The Cryptogram is critical and mandatory. Since the merchant cannot really control this, and since CYBS is just decrypting the payload and uses it, we cannot comment as to why it was missing. The merchant would need to reach out to Apple and/or decrypt the payment themselves locally to check if and why this data was not present, for troubleshooting purposes."
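In case it helps with the local troubleshooting the support team suggests, a hedged sketch that logs the structure of the Apple Pay token before it is sent to the gateway. Note that onlinePaymentCryptogram only exists inside the encrypted data field, so it cannot be inspected here without decrypting the payload:

```swift
import Foundation
import PassKit

// Logs the size and top-level JSON keys of the payment token payload
// (typically data, header, signature, version) before it goes to Cybersource.
func logPaymentToken(_ payment: PKPayment) {
    let tokenData = payment.token.paymentData
    print("paymentData bytes:", tokenData.count)
    if let json = try? JSONSerialization.jsonObject(with: tokenData) as? [String: Any] {
        print("top-level keys:", json.keys.sorted())
    }
}
```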
Replies: 0 · Boosts: 0 · Views: 41 · Activity: 3w
Testing subscriptions in TestFlight
Hi! I created a subscription class and a button that starts the purchase flow. In the Xcode environment and the Simulator everything works correctly: the purchase sheet appears and the subscription flow works as expected. But when I test the app in TestFlight, the subscription button doesn't appear at all. I cannot trigger a purchase, and I can't find a clear tutorial that explains how to make subscriptions work specifically in TestFlight.

Here is what I have already done:
- I created the subscription in App Store Connect.
- I set up a Sandbox account and logged in on the device.
- StoreKit sync works and the product ID matches the one in App Store Connect.
- The same code works perfectly in Xcode Debug and the Simulator, but the button is missing in TestFlight.

I'm totally stuck. Can someone explain how to correctly set up and test subscriptions in TestFlight, and why the subscription button would disappear even though everything works in Xcode? Any help or guidance would be greatly appreciated!

This is the subscription class:

```swift
@Published private(set) var products: [Product] = []
@Published private(set) var activeSubscriptions: Set<StoreKit.Transaction> = []

private var updates: Task<Void, Never>?

var hasActiveSubscription: Bool {
    activeSubscriptions.contains { transaction in
        guard let expirationDate = transaction.expirationDate else { return false }
        return expirationDate > Date() && transaction.revocationDate == nil
    }
}

init() {
    updates = Task {
        for await update in StoreKit.Transaction.updates {
            if case .verified(let transaction) = update {
                activeSubscriptions.insert(transaction)
                await transaction.finish()
            }
        }
    }
}

deinit {
    updates?.cancel()
}

// MARK: Product loading
func fetchProducts() async {
    do {
        let ids = ["test_name"]
        products = try await Product.products(for: ids)
    } catch {
        print("Failed to load products: \(error)")
        products = []
    }
}

// MARK: Purchase
func purchase(_ product: Product) async throws {
    let result = try await product.purchase()
    switch result {
    case let .success(.verified(transaction)):
        await transaction.finish()
    case .success(.unverified):
        break
    case .pending:
        break
    case .userCancelled:
        break
    @unknown default:
        break
    }
}

// MARK: Active subscriptions
func fetchActiveSubscriptions() async {
    var active: Set<StoreKit.Transaction> = []
    for await entitlement in StoreKit.Transaction.currentEntitlements {
        if case .verified(let transaction) = entitlement {
            active.insert(transaction)
        }
    }
    self.activeSubscriptions = active
}
```

This is the button:

```swift
VStack(spacing: 8) {
    ForEach(sub.products) { product in
        PrimaryButtonView(text: mainButtonTitle) {
            Task {
                do {
                    try await sub.purchase(product)
                } catch {
                    print("Purchase error: \(error)")
                }
            }
        }
    }
}
```
Replies: 0 · Boosts: 0 · Views: 47 · Activity: 3w
Question Regarding Campaign Data Limits in “App Store Downloads” Analytics Report
Regarding the “App Store Downloads” Analytics Report, I have submitted a report generation request with accessType: ONE_TIME_SNAPSHOT. Reference: https://developer.apple.com/documentation/analytics-reports/app-download I would like to confirm the scope of the Campaign data included in the report generated by this request. In the App Store Connect UI, I understand that there is an upper limit of 200 Campaign entries displayed. For the report output, however, could you please confirm whether it includes all Campaigns measured on the App Store Connect side? Or is there also any upper limit on the number of Campaign records included in the report itself?
Replies: 0 · Boosts: 0 · Views: 87 · Activity: 3w
SwiftUI Custom Keyboard
How can I correctly display the cursor when using a custom keyboard in SwiftUI without using UIKit? Currently, I'm encountering a conflict between the custom keyboard and the system keyboard in SwiftUI, resulting in both keyboards being displayed simultaneously. If I disable the system keyboard and then handle the custom keyboard, the cursor disappears. How can I resolve this issue?
Replies: 0 · Boosts: 0 · Views: 79 · Activity: 4w
Unable to record above 30 fps on Pro and Pro Max models only
We're developing an AVFoundation-based video recording app (4K @ 60 fps required for biomechanical analysis). On most devices this works perfectly (iPhone 12/14/15/16 non-Pro models), but on several iPhone Pro models (12 Pro, 13 Pro, 14 Pro, 15 Pro/Pro Max) we consistently get 4K 30 fps recordings, even when the device should support 4K 60 fps on the wide-angle camera.

What we observe:
- We configure the session for .hd4K3840x2160.
- We iterate through AVCaptureDevice.formats and select formats that have 3840×2160 resolution and support ≥60 fps (videoSupportedFrameRateRanges).
- On some Pro devices, this format search returns no results, even though the Camera app records 4K60 fine and external references list the wide camera as 4K60 capable.
- The fallback becomes the device's default 4K30 format, so final files are 3840×2160 @ 30 fps.
- This happens immediately on app launch (not after heating), so it is not thermal-related.

What we've tried:
- Force selecting .builtInWideAngleCamera instead of dual/triple cameras.
- Disabling HDR (videoHDREnabled = false).
- Disabling low-light boost.
- Allowing 59.94 fps formats (in case exact 60.0 isn't exposed).
- Logging all videoSupportedFrameRateRanges per format.

What we're seeing in logs: On affected Pro devices, the capture device reports only 4K formats with maxFrameRate ≈ 30 fps, despite the hardware being able to do 4K60.

Main questions:
- Has anyone encountered cases where 4K60 formats are available in the Camera app but not exposed through AVFoundation, especially on Pro models or multi-camera devices?
- Could HEVC/HDR capability or multi-camera constraints be preventing certain formats from appearing?
- Are there known conditions where 4K60 formats are hidden unless a specific device configuration is applied?

Any guidance on reliably locking 4K60 on iPhone Pro models via AVFoundation would be hugely appreciated.
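A minimal sketch of the format search and lock described above; the function names and the 59.9 fps threshold are illustrative assumptions rather than the app's actual code:

```swift
import AVFoundation

// Find a 3840x2160 format whose supported frame-rate ranges reach (roughly) 60 fps.
func find4K60Format(on device: AVCaptureDevice) -> AVCaptureDevice.Format? {
    device.formats.first { format in
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        guard dims.width == 3840, dims.height == 2160 else { return false }
        // Accept 59.94 as well as exact 60 fps.
        return format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 59.9 }
    }
}

// Lock the device onto that format and pin the frame duration to 60 fps.
func lock4K60(on device: AVCaptureDevice) throws -> Bool {
    guard let format = find4K60Format(on: device) else { return false }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    device.activeFormat = format
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 60)
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 60)
    return true
}
```

If find4K60Format returns nil on the affected Pro devices, dumping every format's dimensions and videoSupportedFrameRateRanges is the quickest way to see exactly what the system is exposing there.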
Replies: 0 · Boosts: 0 · Views: 257 · Activity: 3w
【WeChat N51888M】How to register a game account with Tenglong Company?

【WeChat N51888M】How to register a game account with Tenglong Company? 【Website: 211239.com】Step 1: On the login/registration page, find the "Register as a member" button and click it. Step 2: On the registration screen, fill in a username and password to register the account, then click the "Register" button. The system validates and processes the information; if everything goes smoothly, it will report that registration succeeded.
Replies: 0 · Boosts: 0 · Views: 126 · Activity: 4w
Unfinished transactions prevent the confirmation sheet
We feel like we're at the end of the long and treacherous process of migrating to StoreKit 2, but we've hit a small snag. When testing in the sandbox environment, we've found that if we don't finish a transaction, no subsequent purchase (invoked via a call to purchase() or its other overload) will produce the confirmation sheet. Is this the expected behavior? The behavior is observed on iOS 26 and 18.

Our app will only attempt to finish the transaction if it successfully uploads the receipt to our API. If it fails to do so for whatever reason, the transaction is left unfinished. While the user is informed about this, users will commonly try again. Our concern is that since the confirmation sheet will not be shown again, users will not know they are actually paying again, which is most certainly not the UX we want to have. We'd much rather have our users be fully aware when they're paying us money.

The reason we're choosing not to finish the transaction until our backend has received it and confirmed the receipt to be valid is that the only way the user can get their product is if the server side is aware of the purchase and adds more time to the user's account.

When finishing the transaction via finish() immediately after the purchase() call, the confirmation sheet is shown every time after subsequent calls to purchase(). Again, is this the expected behavior, both in the sandbox and the production environments? Are we doing something wrong or misusing the Product API?

We are somewhat stumped because, technically, we could get the first confirmation for a product purchase and then finish it only after an arbitrary number of calls to purchase() have been made: the user will believe they have paid only once, but we would receive however much money we can drain from their account. That is most certainly not the kind of app we want to develop.

Please advise and best regards, Emīls
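A minimal sketch of the flow described above, assuming a hypothetical uploadToBackend helper for the server-side validation (none of these names come from the actual app):

```swift
import StoreKit

// Purchase, hand the signed transaction to the backend, and only finish()
// once the server has confirmed the purchase and credited the account.
func purchaseAndSync(_ product: Product) async throws {
    let result = try await product.purchase()
    switch result {
    case .success(let verification):
        guard case .verified(let transaction) = verification else { return }
        // jwsRepresentation is the signed payload the server can validate.
        let accepted = try await uploadToBackend(verification.jwsRepresentation)
        if accepted {
            await transaction.finish()
        }
        // If the upload fails, the transaction stays unfinished and is
        // redelivered later via Transaction.updates / Transaction.unfinished.
    case .pending, .userCancelled:
        break
    @unknown default:
        break
    }
}

// Assumed helper: posts the JWS to the API and reports whether it was accepted.
func uploadToBackend(_ jws: String) async throws -> Bool {
    // Network call elided in this sketch.
    return true
}
```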
Replies: 0 · Boosts: 3 · Views: 151 · Activity: 3w
HKLiveWorkoutBuilder beginCollection freezes in the simulator
When calling beginCollection on HKLiveWorkoutBuilder, the function never completes and gets stuck (on the second workout session; the first session works flawlessly). To reproduce: run the MirroringWorkoutsSample on watchOS (https://developer.apple.com/documentation/healthkit/building-a-multidevice-workout-app). Start the workout and then end it; it should work perfectly fine the first time. Start and end a workout again, and you should see the problem: the workout doesn't end.
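For reference, a minimal sketch of the call that stalls, using the completion-handler form so a log line shows whether the callback ever fires; the wrapper function is an assumption, not part of the sample project:

```swift
import HealthKit

// Calls beginCollection and logs when (or whether) its completion runs.
func startCollection(with builder: HKLiveWorkoutBuilder, at start: Date) {
    builder.beginCollection(withStart: start) { success, error in
        print("beginCollection completed: success=\(success), error=\(String(describing: error))")
    }
}
```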
Replies: 0 · Boosts: 0 · Views: 40 · Activity: 3w
OCR using Automator
How can I make this more efficient and better? I'm using Grok to write it.

```python
#!/usr/bin/env python3
import os
import sys
import subprocess
import glob
import re

import easyocr
import cv2
import numpy as np

# Check arguments
if len(sys.argv) < 2:
    print("Usage: python3 ocr_global.py <pdf_path> [output_folder]")
    sys.exit(1)

# Get the PDF path and the output folder
pdf_path = sys.argv[1]
output_folder = sys.argv[2] if len(sys.argv) > 2 else os.path.dirname(pdf_path)
basename = os.path.splitext(os.path.basename(pdf_path))[0]
output_prefix = os.path.join(output_folder, f"{basename}_ocr")

# Create the output folder if it does not exist
os.makedirs(output_folder, exist_ok=True)

try:
    # Generate TIFFs with pdftoppm at 300 DPI, forcing page numbering
    result = subprocess.run(
        ["/usr/local/bin/pdftoppm", "-r", "300", "-tiff", "-forcenum", pdf_path, output_prefix],
        check=True, capture_output=True, text=True
    )
    print("TIFFs generated:", result.stdout)
    if result.stderr:
        print("pdftoppm warnings/errors:", result.stderr)

    # Find all generated TIFFs dynamically (try .tif and .tiff)
    tiff_pattern = f"{output_prefix}-*.tif"  # prefer .tif based on the observed output
    tiff_files = glob.glob(tiff_pattern)
    if not tiff_files:
        tiff_pattern = f"{output_prefix}-*.tiff"  # fall back to .tiff if there is no .tif
        tiff_files = glob.glob(tiff_pattern)
    if not tiff_files:
        print("Files found, for debugging:", glob.glob(f"{output_prefix}*"))
        raise FileNotFoundError(f"No TIFFs found matching {tiff_pattern}")

    # Sort by page number
    tiff_files.sort(key=lambda x: int(re.search(r'-(\d+)', x).group(1)))

    # Run OCR on all TIFFs with EasyOCR
    reader = easyocr.Reader(['es'])  # 'es' for Spanish, adjust to your language
    ocr_output_text = os.path.join(output_folder, "out.text")
    with open(ocr_output_text, 'a', encoding='utf-8') as f:
        f.write(f"\n=== Start of document: {pdf_path} ===\n")
        for i, tiff_path in enumerate(tiff_files, 1):
            tiff_base = os.path.basename(tiff_path)
            print(f"--- Document: {tiff_base} | Page: {i} ---")
            f.write(f"--- Document: {tiff_base} | Page: {i} ---\n")

            # Read and preprocess the image
            img = cv2.imread(tiff_path, 0)
            if img is None:
                raise ValueError("Could not read the image")
            _, thresh = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

            # Extract text with EasyOCR
            result = reader.readtext(thresh)
            ocr_text = " ".join([text for _, text, _ in result])  # join with spaces
            print(ocr_text)
            f.write(ocr_text + "\n")

            print(f"--- End of page {i} of {tiff_base} ---")
            f.write(f"--- End of page {i} of {tiff_base} ---\n")
        f.write(f"=== End of document: {pdf_path} ===\n")

    # Delete the generated TIFFs after OCR
    for tiff in tiff_files:
        os.remove(tiff)
    print("TIFFs deleted after OCR.")

except subprocess.CalledProcessError as e:
    print(f"Error processing the PDF: {e}")
    print("Error output:", e.stderr)
except FileNotFoundError as e:
    print(f"Error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
```
Replies: 0 · Boosts: 0 · Views: 24 · Activity: 3w