Blank outputs generated on AIX #1104
Comments
@kunal-vaishnavi @snnn @tianleiwu Regarding the config files: genai_config and the tokenizer config are in JSON format, so those required files are being loaded properly. The dump below, however, does not look good.
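As a point of reference, a minimal sketch for verifying that both JSON configs parse cleanly; the model directory name is taken from the issue, and the config file names are assumed to be the standard ones shipped with the package:

```python
import json
import pathlib

# Directory of the model package named in the issue; adjust if it lives elsewhere.
model_dir = pathlib.Path("phi3-mini-128k-instruct-cpu-int4-rtn-block-32")

# Assumed standard file names for the genai and tokenizer configs.
for name in ("genai_config.json", "tokenizer_config.json"):
    with open(model_dir / name, "r", encoding="utf-8") as f:
        cfg = json.load(f)  # raises JSONDecodeError if the file is malformed
    print(f"{name}: parsed OK, {len(cfg)} top-level keys")
```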
What ORT version do you use? Does ORT itself have official support for AIX? Most of the heavy lifting is handled in ORT, and I honestly don't know the status of ORT's AIX support.
Did more debugging to check whether the external data (the model layer weights) is loaded and parsed properly, as sketched below.
@snnn @tianleiwu @kunal-vaishnavi @prajwal-ibm
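A minimal sketch of such a check, assuming the standard file layout of the model package (the model file name and the choice to inspect only float initializers are assumptions):

```python
import numpy as np
import onnx
from onnx import numpy_helper

# Assumed path to the ONNX graph inside the model package named in the issue.
model_path = "phi3-mini-128k-instruct-cpu-int4-rtn-block-32/model.onnx"

# load_external_data=True also reads the weights stored in the side-car data file,
# so a failure to load or parse them on AIX should surface here.
model = onnx.load(model_path, load_external_data=True)

# Spot-check a few float initializers: dtype, shape, and whether they are all zero.
checked = 0
for init in model.graph.initializer:
    if init.data_type != onnx.TensorProto.FLOAT:
        continue
    arr = numpy_helper.to_array(init)
    print(init.name, arr.dtype, arr.shape, "all zero:", not np.any(arr))
    checked += 1
    if checked >= 5:
        break
```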
Yes. On AIX, only inference support is available in ORT.
generator.get_next_tokens() always returns zero
Model used: phi3-mini-128k-instruct-cpu-int4-rtn-block-32
platform.platform()
'AIX-3-00C001034C00-powerpc-64bit-COFF'
platform.processor()
'powerpc'
platform.system()
'AIX'
platform.version()
'7'
Package versions:
onnx 1.16.2
onnxruntime 1.20.0
onnxruntime-genai 0.5.0
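For completeness, a reproduction loop along the lines of the official onnxruntime-genai 0.5.0 Python example; the prompt text and search options are illustrative assumptions, and the model path matches the package named above:

```python
import onnxruntime_genai as og

# Assumed local directory of the model package named above.
model = og.Model("phi3-mini-128k-instruct-cpu-int4-rtn-block-32")
tokenizer = og.Tokenizer(model)
tokenizer_stream = tokenizer.create_stream()

# Phi-3 chat-style prompt; the exact prompt text is just an example.
prompt = "<|user|>\nWhat is the capital of France?<|end|>\n<|assistant|>\n"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=128)
params.input_ids = input_tokens  # 0.5.0-style API

generator = og.Generator(model, params)

while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    # On AIX this reportedly always yields token id 0, producing blank output.
    new_token = generator.get_next_tokens()[0]
    print(tokenizer_stream.decode(new_token), end="", flush=True)
print()
```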
Thank you in advance!!