I have some C code that returns the memory usage of the current task on my machine, and I recently refactored it to use the proc_taskallinfo struct so I can instead retrieve system-wide memory usage. I'm calling that code from Swift, however I'm getting the error "Initializer 'init(_:)' requires that 'proc_taskallinfo' conform to 'BinaryInteger'", and I'm not sure which field of proc_taskallinfo is the appropriate one to pass here. resident_size does not work in this context.
import SwiftUI
import IOKit
import Foundation

@_silgen_name("kernMem")
func kernMem(storeMemData: UnsafeMutablePointer<proc_taskallinfo>) -> kern_return_t

@main
struct MacStatAppApp: App {
    @State public var printMemory: String = "" // dynamic state to store data that will be passed to SwiftUI
    var body: some Scene {
        WindowGroup {
            ContentView(printMemory: $printMemory) // binding so printMemory can be passed to ContentView
                .onAppear {
                    var storeMemData = proc_taskallinfo() // struct the C function fills in through the pointer
                    let result = kernMem(storeMemData: &storeMemData)
                    if result == KERN_SUCCESS {
                        let memoryUsage = Double(storeMemData) / (1024.0 * 1024.0 * 1024.0) // conversion to GB (1024 to the power of 3); this is the initializer that errors
                        print(String(format: "memory usage: %.2f GB", memoryUsage))
                    } else {
                        print("failed to obtain memory usage data: \(result)")
                    }
                }
        }
    }
}
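For reference, a minimal sketch of reading the nested field rather than the whole struct, assuming the C shim fills proc_taskallinfo via proc_pidinfo with the PROC_PIDTASKALLINFO flavor; ptinfo and pti_resident_size come from <sys/proc_info.h>, while kernMem is the function declared above:

// Sketch: the resident size lives in the nested proc_taskinfo (ptinfo),
// so convert that field instead of passing the whole struct to Double.
func currentResidentGB() -> Double? {
    var info = proc_taskallinfo()
    guard kernMem(storeMemData: &info) == KERN_SUCCESS else { return nil }
    return Double(info.ptinfo.pti_resident_size) / (1024.0 * 1024.0 * 1024.0) // bytes -> GB
}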
I have an async function where I'm using withUnsafeContinuation with a closure, but for some reason Xcode won't let me pass 'error' (an Error) to continuation.resume(throwing:). I get the error "Cannot convert value of type 'any Error'".
public func parseResponseData(data: Data) async throws -> TextResult {
    return try await withUnsafeContinuation { continuation in
        Task {
            do {
                let decoder = JSONDecoder()
                let result = try await decoder.decode(TextResult.self, from: data)
                continuation.resume(returning: result)
            } catch {
                continuation.resume(throwing: error)
            }
        }
    }
}
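For comparison, a minimal sketch using the throwing variant of the continuation API, assuming TextResult conforms to Decodable. withUnsafeContinuation vends a non-throwing continuation, so resume(throwing:) isn't available on it, whereas the checked throwing variant below accepts an Error; JSONDecoder.decode is also synchronous, so no inner Task is needed:

// Sketch: a throwing continuation allows resume(throwing:).
public func parseResponseData(data: Data) async throws -> TextResult {
    return try await withCheckedThrowingContinuation { continuation in
        do {
            let result = try JSONDecoder().decode(TextResult.self, from: data)
            continuation.resume(returning: result)
        } catch {
            continuation.resume(throwing: error)
        }
    }
}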
I'm using Notion's API to print out some data from one of my own pages in Notion. I'm using URLSession to make the request and then parsing the unwrapped data, but nothing is being returned to my console, and I know my endpoint and API key are correct. I've gone through the Notion API documentation and can't seem to find anything in it that I'm not doing or am doing wrong. I'll provide my code as well as the documentation I've been consulting: https://developers.notion.com/reference/intro
import Foundation

struct Page: Codable {
    let id: String
    let title: String
}

let endpoint = URL(string: "https://api.notion.com/v1/pages/8efc0ca3d9cc44fbb1f34383b794b817")
let apiKey = "… redacted …"
let session = URLSession.shared

func makeRequest() {
    if let endpoint = endpoint {
        let task = URLSession.shared.dataTask(with: endpoint) { data, response, error in
            if let taskError = error {
                print("could not establish url request: \(taskError)")
                return
            }
            if let unwrapData = data { // safely unwrapping the data value using if let
                do {
                    let decoder = JSONDecoder() // JSONDecoder to decode the API data
                    let codeUnwrappedData = try decoder.decode(Page.self, from: unwrapData) // type: specifies the struct, from: passes the data parameter that contains the API data to be decoded
                } catch {
                    print("could not parse json data")
                }
            }
            if let httpResponse = response as? HTTPURLResponse {
                if httpResponse.statusCode == 200 {
                    if let apiData = data {
                        print(String(data: apiData, encoding: .utf8)!)
                    }
                } else {
                    print("unsuccessful http response: \(httpResponse)")
                }
                makeRequest()
            }
        }
        task.resume()
    }
}
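For reference, a minimal sketch of the same request sent as a URLRequest carrying the headers the Notion API documentation lists as required on every call: an Authorization bearer token and a Notion-Version header. The "2022-06-28" string is just an example version value from their docs, and the endpoint and apiKey are the ones defined above:

// Sketch: same endpoint, but with the Authorization and Notion-Version
// headers Notion requires; prints the raw JSON body for inspection.
func makeAuthorizedRequest() {
    guard let endpoint = endpoint else { return }
    var request = URLRequest(url: endpoint)
    request.httpMethod = "GET"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("2022-06-28", forHTTPHeaderField: "Notion-Version")

    URLSession.shared.dataTask(with: request) { data, response, error in
        if let error = error {
            print("could not establish url request: \(error)")
            return
        }
        if let data = data, let body = String(data: data, encoding: .utf8) {
            print(body) // inspect the payload before deciding how to decode it
        }
    }.resume()
}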
I know this is a more unusual question to ask on this forum, but I would really like to gauge feedback from a community of other Swift developers on an idea. A colleague and I are seriously considering building an open-source, web-based SwiftUI component catalog featuring a collection of prebuilt modular components, very similar to Shadcn for those of you who have worked in React, but for SwiftUI, where you can also contribute your own SwiftUI elements. I'm curious whether this is something iOS/Swift devs would be interested in and would love to hear thoughts on the idea. This is not a component library, as Xcode technically has one built in, but rather fully built out SwiftUI elements such as UIs that use glassmorphism for macOS, calendars for iOS apps, charts, etc.
I finally got previews for the Dynamic Island to work, and I'm now trying to add static UI elements to my Dynamic Island like I did for my Lock Screen Live Activity, but my Dynamic Island view is showing up totally empty: if I add my app icon image to the compact leading closure it doesn't appear, and if I add text to an expanded region closure it doesn't appear.
I'm really stuck on this and would appreciate the help.
    var body: some View {
        Image("dynamicrep")
            .resizable()
            .scaledToFit()
            .clipShape(.circle)
    }
}

struct DynamicRepLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DynamicRepAttributes.self) { context in
            VStack {
                HStack(spacing: 257) {
                    Text("from \(context.attributes.titleName ?? "no title")")
                        .fontWeight(.light)
                        .font(.system(size: 16))
                        .foregroundStyle(Color.gray)
                    Circle()
                        .frame(width: 53, height: 50)
                        .foregroundStyle(Color.gray).opacity(0.23)
                        .overlay {
                            Image("mmicon")
                        }
                }
                .frame(maxWidth: 500, maxHeight: 210)
                Spacer()
                Text("\(context.attributes.contentBody ?? "no content")")
            }
            .activityBackgroundTint(Color.cyan)
            .activitySystemActionForegroundColor(Color.black)
            .frame(width: 500, height: 300)
        } dynamicIsland: { context in
            DynamicIsland {
                // Expanded UI goes here. Compose the expanded UI through
                // various regions, like leading/trailing/center/bottom
                DynamicIslandExpandedRegion(.leading) {
                    Text("from \(context.attributes.titleName ?? "no title")")
                }
                DynamicIslandExpandedRegion(.trailing) {
                    Circle()
                }
                DynamicIslandExpandedRegion(.bottom) {
                    Text("\(context.attributes.contentBody ?? "no content")")
                }
            } compactLeading: {
                AppLogo()
            } compactTrailing: {
                Text("") // empty for now
            } minimal: {
                Text("hello") // placeholder for now
            }
            .widgetURL(URL(string: "MuscleMemory.KimchiLabs.com"))
            .keylineTint(Color.white)
        }
    }
}
My start Live Activity curl command is not starting my Live Activity, and I keep getting a decoding failure even though my curl payload matches my content state, so my Live Activity never starts. Here's my curl:
--header "apns-topic: MuscleMemory.KimchiLabs.com.push-type.liveactivity" \
--header "apns-push-type: liveactivity" \
--header "apns-priority: 10" \
--header "authorization: bearer eyJhbGciOiJFUzI1NiIsImtpZCI6IjI4MjVTNjNEV0IifQ.eyJpc3MiOiJMOTZYUlBCSzQ2IiwiaWF0IjoxNzU3NDYwMzQ2fQ.5TGvDRk5ZYLsvncjKwXIZYN78X88v5lCwX4fRvfl1QXjwv8tOtO2uoId27LQahXA3zqjruu_2YoOfqEtrppKXQ" \
--data '{
  "aps": {
    "timestamp": '"$(date +%s)"',
    "event": "start",
    "content-state": {
      "plain_text": "hello world",
      "userContentPage": ["hello world"]
    },
    "alert": { "sound": "chime.aiff" }
  },
  "attributes-type": "KimchiKit.DynamicRepAttributes",
  "attributes": {}
}' \
--http2 https://api.sandbox.push.apple.com/3/device/802fe7b4066e26b51ede7188a7077a9603507a0fa6ee8ffda946a864e75aa139602861538d6fb12100afbe9a3338d6c7c799d947dfacb2ee835f0339ecdc3165c9ed7e54839f5a3b89b76a011f5826cc
And here is my content state:
public struct ContentState: Codable, Hashable {
    public var plainText: String
    public var userContentPage: [String]

    public enum CodingKeys: String, CodingKey {
        case plainText = "plain_text"
        case userContentPage
    }

    public init(plainText: String, userContentPage: [String]) {
        self.plainText = plainText
        self.userContentPage = userContentPage
    }
}

public init() {}
}
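For reference, one quick way to check whether the decoding failure is in the ContentState itself is to decode the curl's content-state JSON locally with JSONDecoder. A minimal sketch, assuming the ContentState above is nested in the DynamicRepAttributes type named in the payload's attributes-type:

// Sketch: feed the curl's content-state JSON into JSONDecoder to confirm
// the CodingKeys mapping actually matches the payload keys.
let contentStateJSON = """
{ "plain_text": "hello world", "userContentPage": ["hello world"] }
""".data(using: .utf8)!

do {
    let state = try JSONDecoder().decode(DynamicRepAttributes.ContentState.self, from: contentStateJSON)
    print(state.plainText, state.userContentPage)
} catch {
    print("local decode failed: \(error)")
}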
I'm fairly new to Xcode and Swift, and I'm building my first application. It's a simple chat box GUI in Swift that I want my Python back-end program to be able to send and receive requests from. My Python program uses the FastAPI framework, and I've set the port to 8080 in my Swift GUI on the front end. When I build my application in Xcode the build succeeds, but if I hit the send button in the simulator I get this error, and I've tried everything to fix it.
2023-03-12 21:55:14.750406-0400 Mac GPT[75198:3546121] [connection] nw_socket_handle_socket_event [C1:2] Socket SO_ERROR [61: Connection refused]
2023-03-12 21:55:14.751307-0400 Mac GPT[75198:3546121] Connection 1: received failure notification
2023-03-12 21:55:14.751389-0400 Mac GPT[75198:3546121] Connection 1: failed to connect 1:61, reason -1
2023-03-12 21:55:14.751419-0400 Mac GPT[75198:3546121] Connection 1: encountered error(1:61)
2023-03-12 21:55:14.751550-0400 Mac GPT[75198:3546121] [connection] nw_connection_copy_connected_local_endpoint_block_invoke [C1] Client called nw_connection_copy_connected_local_endpoint on unconnected nw_connection
2023-03-12 21:55:14.751577-0400 Mac GPT[75198:3546121] [connection] nw_connection_copy_connected_remote_endpoint_block_invoke [C1] Client called nw_connection_copy_connected_remote_endpoint on unconnected nw_connection
2023-03-12 21:55:14.752257-0400 Mac GPT[75198:3546121] Task <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1> HTTP load failed, 0/0 bytes (error code: -1004 [1:61])
2023-03-12 21:55:14.757622-0400 Mac GPT[75198:3546668] Task <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1> finished with error [-1004] Error Domain=NSURLErrorDomain Code=-1004 "Could not connect to the server." UserInfo={_kCFStreamErrorCodeKey=61, NSUnderlyingError=0x600003a23a80 {Error Domain=kCFErrorDomainCFNetwork Code=-1004 "(null)" UserInfo={_NSURLErrorNWPathKey=satisfied (Path is satisfied), viable, interface: lo0, dns, _kCFStreamErrorCodeKey=61, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1>"
), NSLocalizedDescription=Could not connect to the server., NSErrorFailingURLStringKey=http://192.168.1.155:8080/MacGPT, NSErrorFailingURLKey=http://192.168.1.155:8080/MacGPT, _kCFStreamErrorDomainKey=1}
Error:Error Domain=NSURLErrorDomain Code=-1004 "Could not connect to the server." UserInfo={_kCFStreamErrorCodeKey=61, NSUnderlyingError=0x600003a23a80 {Error Domain=kCFErrorDomainCFNetwork Code=-1004 "(null)" UserInfo={_NSURLErrorNWPathKey=satisfied (Path is satisfied), viable, interface: lo0, dns, _kCFStreamErrorCodeKey=61, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask <0C6550E8-6F1C-42FD-9C78-2E25AF2DD4F9>.<1>"
), NSLocalizedDescription=Could not connect to the server., NSErrorFailingURLStringKey=http://192.168.1.155:8080/MacGPT, NSErrorFailingURLKey=http://192.168.1.155:8080/MacGPT, _kCFStreamErrorDomainKey=1}
If it helps I can also show my Swift code. Thanks for the help.
I'm fairly new to Xcode and Swift, and I'm building my first application. It's a simple chat box GUI in Swift that uses ChatGPT via OpenAI's API, which I've imported into my back-end Python program. I want to use my Python program to send and receive requests from the SwiftUI chat-box GUI. My Python program uses the FastAPI framework, and I've set the port to 8080 in my Swift GUI on the front end. When I build my application in Xcode the build succeeds, but if I hit the send button in the simulator I get this error, and I've tried everything to fix it.
The error that appears in the Xcode console (I shortened the error message for simplicity):
"2023-03-12 21:55:14.750406-0400 Mac GPT[75198:3546121] [connection] nw_socket_handle_socket_event [C1:2] Socket SO_ERROR [61: Connection refused]"
Below is my SwiftUI code for the front-end chatbox GUI:
import SwiftUI

struct ContentView: View {
    @State private var text: String = ""

    func sendRequest() {
        guard let url = URL(string: "http://MY-MACHINES-IP-ADDRESS-HERE:8080/MacGPT") else {
            return
        }
        var request = URLRequest(url: url)
        request.httpMethod = "GET"
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        let parameters = ["text": text]
        request.httpBody = try? JSONSerialization.data(withJSONObject: parameters, options: [])
        URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                print("Error: \(error)")
                return
            }
            guard let data = data else {
                print("Data not found")
                return
            }
            if let response = String(data: data, encoding: .utf8) {
                print("Response: \(response)")
            } else {
                print("Invalid response type")
            }
        }.resume()
    }

    var body: some View {
        HStack {
            VStack(alignment: .center, spacing: 20) {
                Image(systemName: "globe")
                    .foregroundColor(Color.blue)
                    .font(.system(size: 30))
                Text("Access the power of AI right from your Mac's homescreen, just for Mac.")
                    .font(Font.custom("Futura", size: 15))
                    .fontWeight(.bold)
                HStack {
                    TextField("Ask Mac GPT...", text: $text)
                        .font(Font.custom("Futura", size: 13.4))
                        .fontWeight(.bold)
                        .padding(.horizontal, 5)
                        .padding(.vertical, 13)
                        .background(Color.white)
                        .cornerRadius(29)
                    Button(action: sendRequest) {
                        Image(systemName: "arrow.up.circle.fill")
                            .foregroundColor(Color.blue)
                            .frame(width: 50, height: 45)
                            .font(Font.system(size: 35))
                    }
                    .buttonStyle(PlainButtonStyle())
                }
                .padding()
                .background(Color.white.opacity(0.9))
                .cornerRadius(50)
                .padding(.horizontal, 20)
                .padding(.bottom, 70)
            }
            .frame(minWidth: 900, maxWidth: 6000, minHeight: 600, maxHeight: 7000)
            .background(Color.white.opacity(0.1))
            .cornerRadius(29)
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
And below this is my Python program with FastAPI that I want to add into my Swift Xcode project:
import os
import fastapi
import openai
import tkinter as tk
from fastapi import FastAPI, Request
import uvicorn
import requests
import socket

app = FastAPI(max_request_size=100000000)

API_KEY = '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'
engine = "text-davinci-003"
os.environ['AI_KEY'] = API_KEY
openai.api_key = os.environ['AI_KEY']


@app.get("/")
async def handle_requests():
    return {"demo"}


async def MacGPT(request: Request):
    print(f"max request size: {request.app.max_request_size}")
    data = await request.json()
    userMessage = data['userMessage']
    response = openai.Completion.create(engine=engine, prompt=userMessage, max_tokens=200)
    result = response["choices"][0]["text"]
    return {"result": result}


@app.post("/MacGPT")
async def MacGPT(request: Request):
    data = await request.json()
    userMessage = data['userMessage']
    response = openai.Completion.create(engine=engine, prompt=userMessage, max_tokens=200)
    result = response["choices"][0]["text"]
    return {"result": result}


def submit_callback():
    prompt = user_input.get()
    response = openai.Completion.create(engine=engine, prompt=prompt, max_tokens=200)
    result = response["choices"][0]["text"]
    result_label.config(text=result)


root = tk.Tk()
root.title("Mac GPT")
user_input = tk.Entry(root)
user_input.pack()
user_submit_button = tk.Button(root, text="Send", command=submit_callback)
user_submit_button.pack()
result_label = tk.Label(root, text="")
result_label.pack()
root.mainloop()
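For reference, a minimal sketch of a Swift request shaped to match the FastAPI handler above: the @app.post("/MacGPT") route reads a userMessage key from a JSON body, so the request would be a POST rather than a GET with a body, and the host below is only a placeholder for wherever the uvicorn server is actually listening:

// Sketch: POST to /MacGPT with the userMessage key the FastAPI handler
// expects; replace the placeholder host with your server's address.
import Foundation

func sendChatRequest(text: String) {
    guard let url = URL(string: "http://127.0.0.1:8080/MacGPT") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["userMessage": text])

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error {
            print("Error: \(error)")
            return
        }
        if let data = data, let body = String(data: data, encoding: .utf8) {
            print("Response: \(body)")
        }
    }.resume()
}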
I'm newer to Swift and SwiftUI, but I'm building a simple AI-powered chatbox. When I try to move the button in my UI back into position using the x/y offset modifiers, the button gets partially hidden in the body of the UI, as can be seen in the bottom right of the picture provided.
func makeTextFieldAndButton() -> some View {
    HStack {
        Spacer()
        TextField("Ask Mac GPT...", text: $text)
            .font(Font.custom("Futura", size: 17.4))
            .fontWeight(.bold)
            .padding(.horizontal, 25)
            .padding(.vertical, 15)
            .background(Color.white)
            .cornerRadius(29)
            .overlay(
                RoundedRectangle(cornerRadius: 27.9).stroke(Color.gray, lineWidth: 1.2)
            )
            .offset(y: -120)
        Button(action: {
            let process = Process()
            process.executableURL = URL(fileURLWithPath: "/usr/bin/swift")
            process.arguments = ["/Users/alexhaidar/Documents/Developer/Mac GPT/serverSide.swift", filePath]
            try? process.run()
        }) {
            ZStack {
                Spacer()
                Image(systemName: "arrow.up.circle.fill")
                    .foregroundColor(Color.blue)
                    .font(Font.system(size: 40))
                    .buttonStyle(PlainButtonStyle())
                    .padding(.top)
                    .offset(y: max(-12, -120 + (text.isEmpty ? -10 : 0)))
                    .offset(x: max(-21, text.isEmpty ? -21 : 0))
            }
        }
        .overlay(
            RoundedRectangle(cornerRadius: 27.9)
                .stroke(Color.clear, lineWidth: 1.2)
        )
        .background(Color.clear)
        .offset(x: 21)
    }
}
}
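For comparison, a minimal sketch of the same field-plus-button row laid out without the offset modifiers, letting the HStack's own alignment and padding place the button; this is just one alternative arrangement, not necessarily the final design, and sendButtonTapped is a placeholder for whatever the button should actually do:

// Sketch: rely on HStack alignment and padding instead of x/y offsets,
// so the button stays vertically centered next to the text field.
func makeTextFieldAndButton() -> some View {
    HStack(spacing: 12) {
        TextField("Ask Mac GPT...", text: $text)
            .font(Font.custom("Futura", size: 17.4))
            .padding(.horizontal, 25)
            .padding(.vertical, 15)
            .background(Color.white)
            .cornerRadius(29)
            .overlay(
                RoundedRectangle(cornerRadius: 27.9).stroke(Color.gray, lineWidth: 1.2)
            )
        Button(action: sendButtonTapped) { // placeholder action
            Image(systemName: "arrow.up.circle.fill")
                .foregroundColor(Color.blue)
                .font(Font.system(size: 40))
        }
        .buttonStyle(PlainButtonStyle())
    }
    .padding(.horizontal, 20)
}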
This is my first Xcode application. I'm building a simple macOS chatbox application that uses Python scripts, PythonKit, and Swift to handle server-side operations and access OpenAI's API. It's meant to trigger the two methods I have in a function called executeProcess(), which in turn invokes functions in another file when a question is typed in the text field and the Enter key is hit via onCommit; however, I'm getting no console output. Here is the relevant code from my ContentView.swift file; I can provide more code from other files if needed (I will just be showing the non-SwiftUI-specific code here).
import SwiftUI
import Cocoa
import Foundation
import PythonKit
import AppKit

protocol runPyRunnable {
    func runPyServer(completion: @escaping (String) -> Void)
    func sendRequest(userInput: String, completion: @escaping (String) -> Void)
}

func runPyServer() -> String {
    print("server run")
    return "server run"
}

struct MyPyTypePlaceHolder: runPyRunnable {
    func runPyServer(completion: @escaping (String) -> Void) {
    }
    func sendRequest(userInput: String, completion: @escaping (String) -> Void) {
    }
}

struct ContentView: View {
    var ViewController: runPyRunnable? = MyPyTypePlaceHolder() as? runPyRunnable
    @State private var text: String = ""
    @State private var filePath = ""
    @State private var inputText = ""

    var body: some View {
        makeContent()
            .onAppear {
                NSApp.mainWindow?.makeFirstResponder(NSApp.mainWindow?.contentView)
            }
    }

    func makeContent() -> some View { // view built here is what body calls above
        ZStack {
            Spacer()
            TextField("Ask Mac GPT...", text: $inputText, onCommit: {
                executeProcess(withInput: inputText) { response in
                    print(response)
                }
            })
            .font(Font.custom("Futura", size: 17.4))
            .padding(.horizontal, 25)
            .padding(.vertical, 15)
            .background(Color.white)
            .cornerRadius(29)
            .overlay(
                RoundedRectangle(cornerRadius: 27.9).stroke(Color.gray, lineWidth: 1.0)
            )
            .offset(y: -200)
            .padding(.horizontal, 35)
        }
    }

    func executeProcess(withInput input: String, completion: @escaping (String) -> Void) {
        DispatchQueue.global().async {
            DispatchQueue.main.async {
                guard !input.isEmpty else {
                    print("TextField is empty, enter input in the text field")
                    return
                }
                if let myPyTypeInstance = self.ViewController {
                    myPyTypeInstance.runPyServer { responseFromRunPyServer in
                        myPyTypeInstance.sendRequest(userInput: input) { responseFromSendRequest in
                            completion(responseFromSendRequest)
                        }
                    }
                }
            }
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
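One thing worth noting about the flow above: MyPyTypePlaceHolder's runPyServer and sendRequest have empty bodies and never invoke their completion closures, so executeProcess's completion never fires and nothing reaches the console. A minimal sketch of a conformance that does call its completions, purely to illustrate the chain (LoggingPyRunner and its response strings are placeholders, not the real PythonKit-backed implementation):

// Sketch: a stand-in conformance whose methods always call their
// completion handlers, so the executeProcess chain produces output.
struct LoggingPyRunner: runPyRunnable {
    func runPyServer(completion: @escaping (String) -> Void) {
        completion("server started (placeholder)")
    }
    func sendRequest(userInput: String, completion: @escaping (String) -> Void) {
        completion("echo: \(userInput) (placeholder)")
    }
}

With ViewController set to LoggingPyRunner(), typing in the field and pressing Return should at least print the placeholder response, which narrows down whether the real implementation is the part that never completes.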
In anticipation of the Action button coming on the iPhone 15 Pro, am I able to build in functionality and support that takes advantage of the Action button for my app in Xcode? I know there are a few third-party apps for watchOS that have built-in support for the Action button on the Apple Watch Ultra, but I wanted to get more information on this from other developers.
I'm creating a simple chatbox using an API caller library I created and imported, but it looks like Xcode is not recognizing the module, as I get multiple "no member" errors for the 'ChatClient' type.
import SwiftUI
import openaiLibrary

final class ViewModel: ObservableObject {
    private var openAI: openaiLibrary.ChatClient

    init(apiKey: String) {
        let config = openaiLibrary.ChatClient.OpenAIEndpointProvider.makeDefaultKey(api_key: apiKey, endpointProvider: openaiLibrary.ChatClient.OpenAIEndpointProvider())
        self.openAI = openaiLibrary.ChatClient(apiKey: apiKey, openaiEndpoint: config.baseURL)
    }

    public func sendAIRequest(with search: String, completion: @escaping (Result<String, Error>) -> Void) {
        openAI?.sendCompletion(with: search) { result in
            switch result {
            case .success(let response):
                if let text = response.choices.first?.text {
                    completion(.success(text))
                } else {
                    completion(.failure(NSError(domain: "error", code: 1, userInfo: [NSLocalizedDescriptionKey: "No response found"])))
                }
            case .failure(let error):
                completion(.failure(error))
            }
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
I can also provide the source code for the API caller my openaiLibrary package dependency uses, to confirm everything is defined correctly so that Xcode recognizes it; due to character constraints I wasn't able to fit it in this post.
I have a bunch of JSON data printed to my console, but I want that data to be displayed in my UI in the iOS simulator's ContentView. I'll show the class with my instances and the property wrapper I'm using, as well as the function that calls my other code that currently displays the data in the console.
@Published var makeapirequest: String = "" // make sure the instance is correct

func makeAPIRequest() {
    makeRequest()
}
}

@StateObject public var NotionCaller = NotionCall() // manage lifecycle of the instance
And then below is the Text() line I added to display the data, as well as the onAppear modifier:
NotionCaller.makeAPIRequest()
}
Text(NotionCaller.makeapirequest)
I'm trying to display the data by calling the instance of the function that calls makeRequest(), but the data is still being printed to the console rather than shown in my SwiftUI view.
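For reference, a minimal sketch of the shape I understand this needs: the network completion assigns the response string to the @Published property on the main thread, and the view reads that property, so SwiftUI re-renders the Text when the data arrives. NotionCall, makeapirequest, and makeAPIRequest are the names from the code above; NotionView and the commented request internals are placeholders:

// Sketch: publish the response string so the Text view updates with it.
import SwiftUI

final class NotionCall: ObservableObject {
    @Published var makeapirequest: String = ""

    func makeAPIRequest() {
        // ... build and send the URLRequest as before ...
        // in the dataTask completion, instead of print(apiData):
        //     DispatchQueue.main.async {
        //         self.makeapirequest = String(data: apiData, encoding: .utf8) ?? ""
        //     }
    }
}

struct NotionView: View {
    @StateObject private var notionCaller = NotionCall()

    var body: some View {
        Text(notionCaller.makeapirequest)
            .onAppear { notionCaller.makeAPIRequest() }
    }
}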
My Dynamic Island UI comes up empty when I send my curl. This is a push-to-start, push-driven Live Activity, and when I send my curl this is what appears, despite my being able to render the UI through a local push with no problem. Here is my curl:
curl -v \
-H "apns-topic: MuscleMemory.KimchiLabs.com.push-type.liveactivity" \
-H "apns-push-type: liveactivity" \
-H "apns-priority: 10" \
-H "Content-Type: application/json" \
-H "authorization: bearer eyJhbGciOiJFUzI1NiIsImtpZCI6IjI4MjVTNjNEV0IifQ.eyJpc3MiOiJMOTZYUlBCSzQ2IiwiaWF0IjoxNzU4ODU2MDkyfQ.i83VbgROsxEzdgr512iQkVsp0FjHIoHq2L6IB2aL1fImJgX-XM6TM5frNnVyfva7haMd9fDGjO2D_wfCq8WnBg" \
--data '{
  "aps": {
    "timestamp": '"$now"',
    "event": "start",
    "content-state": {
      "plain_text": "hello world",
      "userContentPage": ["hello world"]
    },
    "attributes-type": "KimchiKit.DynamicRepAttributes",
    "attributes": {
      "activityID": "12345"
    },
    "alert": {
      "title": "Workout started",
      "body": "We’ll show your reps on the Lock Screen.",
      "sound": "default"
    }
  }
}' \
--http2 https://api.sandbox.push.apple.com/3/device/80d50a03472634d9381b729deec58a3e250ea0006b7acd7c2d6ef19e553dcdb010eb1434ff9a6907380f6ed3e9276d57d58f3cda3ac9fc3bea67abae116601a63ec77a34174fd271c4151ec898abae30
And here's my content state, which resides in a shared module:
@available(iOS 17.0, *)
public struct DynamicRepAttributes: ActivityAttributes, Codable {
    public struct ContentState: Codable, Hashable {
        public var plainText: String
        public var userContentPage: [String]

        public enum CodingKeys: String, CodingKey {
            case plainText = "plain_text"
            case userContentPage
        }

        public init(plainText: String, userContentPage: [String]) {
            self.plainText = plainText
            self.userContentPage = userContentPage
        }
    }

    public var activityID: String

    public init(activityID: String) {
        self.activityID = activityID
    }
}
I've also already verified that my attributes type is correct. I've been stuck on this issue and would really appreciate the help.