Build fails on aarch64 hosts #747
Update: I tried the build with Ubuntu 22.04.4 LTS (Jammy Jellyfish) and am getting the same error. (I changed the title of the issue to reflect this.)
Hi @msgilligan, The build works for me with Ubuntu 22.04 (22.04.4 LTS) using the Dockerfile at https://optee.readthedocs.io/en/latest/building/prerequisites.html. It looks like for some reason your build is not picking up the proper cross-compiler. Could you try building with
Yes, when I run the same ... I will try your suggestion and report back.
Here is the result:
I was able to successfully build the project in another Debian 12 environment and I compared the ... Of course, the code in ...
@jforissier My goal is to be able to develop for OP-TEE on my M1 MacBook Pro. I have Docker installed and use Lima for long-lived development VMs. I prefer to use Debian (Bookworm) over Ubuntu, but am willing to use Ubuntu if necessary. But I do not want to use amd64/x86_64 emulation (builds take many hours); I want to run native arm64 VMs. I am willing to troubleshoot, document, and submit PRs to help make this happen. I can also move this to the mailing list, if that is a better venue. I have verified that the reference
Although this is different from the symptom I initially reported, it seems like the first place to start, because in the case of a Docker build the environment is more carefully controlled and I'm seeing a difference in behavior based (apparently) solely on the CPU architecture. I've also seen this problem (or similar) in builds inside VMs and on my Intel box. Under Docker the build seems to occur in the root directory, so ... From my perspective, it seems like there are at least two issues/todos to work on:
On a separate (and positive!) note, when I do run the Docker build using amd64 and use Docker to copy ... I should also mention that when I did preliminary investigations in October 2023, I was able to do the full build in an arm64 VM on macOS and run ...
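For context on the amd64-vs-arm64 Docker builds mentioned above, the build platform can be pinned explicitly with Docker's --platform flag. This is only a sketch; the image tags and Dockerfile are placeholders, not names taken from this thread:

```sh
# Force x86_64 emulation on an Apple Silicon / arm64 host (slow, but matches the
# reference x86_64 environment). The "optee-build" tags are placeholders.
docker build --platform linux/amd64 -t optee-build:amd64 .

# Native arm64 build, which is what this issue is ultimately about.
docker build --platform linux/arm64 -t optee-build:arm64 .
```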
P.S. Using ...
Hi @msgilligan, Being able to build natively on arm64 is a good thing obviously, and as you said it used to work. It should not be that hard to fix the issues and I appreciate your willingness to help in that regard.
That should be
Yes, it was introduced by f0a2eef. Basically it takes its value from
Does ...
This documentation page shows
Yes. If I set
Ah, yes. The doc needs updating.
C++ support in TAs does not work with GCC 12.3. "make toolchains" is supposed to install 11.3; how does your setup end up pulling something from host-gcc-final-12.3.0?
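One way to see which GCC Buildroot actually selected is to inspect its generated configuration. This is a sketch; the out-br output directory name is an assumption about the usual reference-build layout and is not confirmed in this thread:

```sh
# host-gcc-final-<version> is the compiler Buildroot builds for itself; its version
# comes from the Buildroot configuration rather than from "make toolchains".
grep BR2_GCC_VERSION ../out-br/.config
ls ../out-br/build/ | grep host-gcc
```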
I'm not sure. The only difference in my setup (I'm running a slightly modified version of the example
Will do. By the way, with the following make line the build completes successfully:
(So disabling Rust and removing the |
This works as well:
and so does:
I opened Issue #749 for the specific problem of the Rust toolchain not being downloaded on aarch64 hosts, and I've submitted a "draft" patch that I believe addresses the issue. PR #748 seems to solve the download part of the issue, but there is still a separate problem with actually building the Rust examples. I'll leave this issue open because I think we should probably also make an issue for the C++ tests not working on aarch64 hosts.
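As a side note, a quick way to check which Rust toolchains and cross-compilation targets are present on the host is shown below. This assumes rustup is installed; the reference build normally fetches its own Rust toolchain, so treat this purely as a diagnostic sketch:

```sh
# Show the active Rust toolchain and any targets installed via rustup.
rustup show
rustup target list --installed
```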
@jforissier asks:
Looking at
So that seems to answer the question. Maybe we need to adjust the Buildroot configuration to pull/build the compatible version? Update: I created Issue #751 for this specific problem, as there are at least three sub-issues we've found on this issue.
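If pinning the Buildroot-built GCC turns out to be the right fix, a hypothetical config change could look like the following. The BR2_GCC_VERSION_11_X symbol and the br-ext/configs path are assumptions (Buildroot's version-choice symbols vary between releases and neither is confirmed in this thread):

```sh
# Hypothetical: select the 11.x GCC series in the relevant Buildroot defconfig fragment.
# Both the symbol name and the fragment path below are assumptions, not verified here.
echo 'BR2_GCC_VERSION_11_X=y' >> br-ext/configs/optee_generic_defconfig
```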
I have been unable to reproduce the original error (which I saw on x86_64) reported in this issue, but there are three aarch64 sub-issues now:
I also opened apache/incubator-teaclave-trustzone-sdk#135 in the Teaclave Rust repository that references these issues, so the Rust developers are made aware; maybe they have some ideas. We should probably do one or more of the following:
@jforissier What do you suggest?
Done.
This issue has been marked as a stale issue because it has been open (more than) 30 days with no activity. Remove the stale label or add a comment saying that you would like to have the label removed; otherwise this issue will automatically be closed in 5 days. Note that you can always re-open a closed issue at any time.
Not stale! We need to fix this!
@msgilligan, what's the current status now that #764 has been merged?
This comment identified 3 aarch64-host-related issues. At this point, only the last of those issues remains: Issue #752 in this repo (I also created #747 (comment) over on the Rust project to make sure they are aware of it). Please see apache/incubator-teaclave-trustzone-sdk#135 (comment), which links to a gist with a Containerfile that works on AMD64 but fails on ARM64. I am hoping someone with more experience than I have with Rust cross-compilation and the custom linking options needed for TAs could take a look at it.
Thanks for the update.
I feel like I misread this question 2 weeks ago. I don't think I was aware of that particular PR and thought you were asking about a previous one that was merged. It looks like there have been multiple commits related to this issue merged, so I will try to build with the latest ...
OK, I've verified that the current master compiles successfully on aarch64 Linux (Debian 12 inside QEMU on macOS). But if I set ...
I suppose we could close this issue now that we're down to apparently just #752, although we might want to open another issue for getting the C++ tests working on aarch64. (Personal note: I am trying to develop a TA in Rust, so personally I'm much more concerned about the Rust build issue than the C++ one.)
I'm attempting to build on an RPi3 for research purposes. Everything is great until I run the command make toolchains. I noticed that my aarch32 and aarch64 folders do not populate with the proper files. My error seems related to what's happening here. I'm receiving errors related to aarch64 that weren't solved by altering the Rust value, as suggested above. Any help is appreciated.
I don't plan on using this build past 2038, so unless necessary I won't be fixing that. But I also receive the following errors:
Which OS are you using? Is it a 32-bit or 64-bit version? What is the list of packages you have installed with ...? Most OP-TEE developers cross-compile on AMD64 (x86) Linux. I've been compiling on ARM64 in a Debian VM on Apple Silicon. Building on a Pi 3 will be very slow, but I think we should try to fix any issues that occur. If you have a Pi 4 or Pi 5, I would try using that for the build simply because it will be faster.
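For reference, a few standard commands that answer the 32-bit/64-bit and OS-version questions (plain Debian/Raspberry Pi OS tooling, nothing OP-TEE specific):

```sh
cat /etc/os-release          # distribution and release (bookworm, bullseye, ...)
uname -m                     # armv7l => 32-bit userland, aarch64 => 64-bit
dpkg --print-architecture    # armhf vs arm64
```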
@msgilligan I followed the OP-TEE Prerequisites. I've got a Pi5, but used a 3 because the docs indicate that is what is supported. My Pi3 is armv7l and 32-bit.
@aElizaF You didn't fully answer this question. Are you using Raspberry Pi OS, Debian, etc.? Raspberry Pi OS comes in 32-bit and 64-bit versions (with the 32-bit version being recommended for "most users" by Raspberry Pi). Can you let us know whether you are running the 32-bit or 64-bit version and whether you're using "bookworm" or "bullseye", etc.? I strongly recommend using the 64-bit Bookworm version for builds (and running it on your Pi5, which is faster and has more memory). I have successfully built an OP-TEE disk image for rpi3 using Debian Aarch64 Bookworm in a VM on my MacBook Pro. You do not need to run the build on the same machine you want to run OP-TEE on. Use an x86_64 Linux machine if you have one, or (after making sure you have 64-bit Raspberry Pi OS installed) try it on your Pi5. The sequence I used to complete the build was:
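For reference, the standard Raspberry Pi 3 build sequence from the OP-TEE documentation looks roughly like this (it may differ in detail from the exact commands used here):

```sh
mkdir optee-rpi3 && cd optee-rpi3
repo init -u https://github.com/OP-TEE/manifest.git -m rpi3.xml
repo sync
cd build
make -j"$(nproc)" toolchains
make -j"$(nproc)"
```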
Note that I had two issues getting this to work under Debian Aarch64:
There is probably a better fix than making warnings non-fatal, but this change got the build working (and the SD card image I built boots and passes ...)
I created a new issue with the 3 fixes/workarounds for building for rpi3 on Debian Aarch64: #775 |
I just re-read this and I think this means you're running a 32-bit OS. The Pi3 has a 64-bit capable processor, but I'm guessing you're running a 32-bit OS and used
My apologies, I'm using Raspbian on the 3. I'll build out the RPi5. I only used the 3 since that was indicated as having support and the 5 didn't. I'll report back in a few days; I haven't been to my lab, so I haven't worked on this.
@msgilligan The RPi5 works! I used Ubuntu 24.04, and realized I need more storage, as the 64 GB ran out of space. I didn't realize OP-TEE needed that much disk space. Once I get larger storage, I'll continue on.
OP-TEE OS un-stripped is less than 5 MB and the TAs are very small as well, usually a few hundred kB. The driver for it and the user-space client code (tee-supplicant) add a few more MB. So it's not "OP-TEE" that takes disk space. When it comes to the disk space used when building our reference builds, they tend to vary a bit. Both my QEMU v8 and RPi3 builds land at around 15 GB each when everything has been built. It's typically Buildroot, the toolchain(s), and the Linux kernel that take up most of the disk space.
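To see where the space actually goes after a build, something like the following works from the top of the synced tree (directory names will vary; this is just a sketch):

```sh
# Largest items first; the Buildroot output, toolchains and the Linux kernel tree
# are typically at the top of the list.
du -sh ./* | sort -rh | head
```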
This issue has been marked as a stale issue because it has been open (more than) 30 days with no activity. Remove the stale label or add a comment saying that you would like to have the label removed; otherwise this issue will automatically be closed in 5 days. Note that you can always re-open a closed issue at any time.
Update: The x86_64 build issue turned out to be non-reproducible, so this issue has been renamed. The actual problem I have been troubleshooting is building on aarch64 hosts and there are at least 3 sub-issues that have been discovered. This issue now serves as a parent issue for aarch64 host build issues.
On Debian 12 (x86_64), I'm following the standard build instructions for QEMU v8, e.g.
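For reference, the standard QEMU v8 steps from the OP-TEE documentation are roughly as follows (a sketch of the documented flow, not necessarily the exact command line used here):

```sh
mkdir optee-qemu && cd optee-qemu
repo init -u https://github.com/OP-TEE/manifest.git -m qemu_v8.xml
repo sync
cd build
make -j"$(nproc)" toolchains
make -j"$(nproc)" run
```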
The error I'm getting is: