Cmake big redesign #349
Conversation
A few how-tos one needs to try this out. Generally speaking, the requirements for dependencies are almost the same, so the standard […]. After cloning the pull request, create a new folder […] and you should be able to run Machinekit-HAL (so far without the […]).
What would be the best way to structure complex managed HAL modules like the […]? One way to do it is to create the […]. The other way is to pull everything, without branching, into the driver directory. The same goes for the […].
Looking at the […]. Also, there is the part of the […], which in the end means that the […]. Thank you.
I’m sure the rsct.bin file could be generated at build time. I’ll have to dig up my notes on how I generated that, but it’s basically an empty resource table that is only necessary to convert the .bin files that are generated by pasm into an ELF format suitable for use with rproc. As far as I know the .bin files and .dbg files can be anywhere. The path is specified in the instantiation of hal_pru_generic, so configurations would need to be changed if the path changes. @cdsteinkuehler may have thoughts as well.
The way I generated the rsct.bin file was based on this script: https://github.com/rogerq/pru-eth-firmware/blob/master/update_firmware.sh I believe I compiled a simple PRU program from the BeagleBone examples that can be executed with rproc and stripped out the resource table. I’m not very familiar with the ELF format, but there has to be a better way to get a basic resource table.
Digging a little further, there's a resource_table_empty.h header file that provides the empty resource table: https://git.ti.com/cgit/pru-software-support-package/pru-software-support-package/tree/labs/Getting_Started_Labs/c_code/solution/am572x/resource_table_empty.h
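For context, the empty table itself is tiny. A minimal C version along the lines of TI's resource_table_empty.h might look like the sketch below; the struct layout mirrors the header fields of the kernel's resource table format, and the variable name is illustrative, not copied from the linked file.

```c
#include <stdint.h>

/* Header layout of a remoteproc resource table (sketched from the
 * kernel's struct resource_table; fields assumed, for illustration). */
struct my_resource_table {
    uint32_t ver;          /* format version, must be 1 */
    uint32_t num;          /* number of entries; 0 = empty table */
    uint32_t reserved[2];  /* must be zero */
};

/* Placed in its own section so the firmware loader can locate it;
 * .resource_table is the section name remoteproc-style loaders scan for. */
struct my_resource_table pru_remoteproc_ResourceTable
    __attribute__((section(".resource_table"), used)) = {
    1, 0, { 0, 0 },
};
```

With `num` set to 0 the table carries no entries, which matches the "empty resource table" described above.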
Thanks for the link! This looks a bit simpler. I have done some digging too. There is some example from Klipper - which uses the C interface and not the PASM - but maybe it would be possible to bend it. (Bit outside my area of expertise, so bear with me if I am way off.) Also, it seems that bits and pieces of the am335x_pru_package are hardcoded into the Machinekit-HAL repository from the times of the Machinekit monorepo. Perusing the code, I so far cannot see if there were some changes on the Machinekit side or if it is just an old version of it, but I think the reasonable action would be to fork that repository into the Machinekit organization, write simple packaging for it (should not be hard) and turn it into a build dependency, thus eliminating the need to build the tool in each build and to keep this support code in the Machinekit-HAL repository. This would also make it easier when including changes from upstream. It also appears that when cross-compiling, the current build compiles the […]
How about something like this? #357 |
As for the am335x_pru_package, I modified some of the code that was copied from that package in order to get support for remoteproc working on the Beaglebone AI (the uio_pruss driver doesn't work on the AI), but still use the uio_pruss driver on the Beaglebone Black. It seems like TI wants everyone to start using remoteproc rather than the uio_pruss driver. I also believe pasm isn't supported any more by TI. They encourage everyone to use gcc or clpru to compile C for the PRU. The PRU side of hal_pru_generic, though, is all implemented in assembly and porting it to C seems like a lot of work. Instead, I figured out how to convert the compiled output from pasm into an ELF file that is suitable to be started/stopped using remoteproc. So, all that said, I'm not sure how much benefit there is to splitting that stuff out since it's already been integrated into machinekit-hal and I doubt there will be any valuable upstream updates.
I am trying to find the best possible solution to structure the current code into small, isolated pieces so that users don't have to go through the whole big codebase and everything looks pretty much uniform and approachable. The idea to put it inside its own package/repo was born from the fact that it is basically part of the toolchain (akin to GCC) and it would be easier to integrate upstream changes. If you say that TI basically killed the PASM, then the second part is passé and the first one is not that important (it would need to be built to the side and the Toolchain file would have to include information about both the host and foreign compiler, but that is exactly what is happening now, so pretty much no logical change). So, if you think this is the best solution (and as this endeavour is not about rewriting the code, just how it is structured and built), then I will take it.
I do think there is tremendous value in breaking code up into more maintainable pieces. It would be nice, for example, to be able to work only on hal_pru_generic without needing to worry about compiling the rest of machinekit-hal. I developed a new HAL component bb_gpio that is an alternative to hal_bb_gpio and haven't bothered to integrate it into machinekit-hal because it's easier to work on it separately. So, I do think there could be value in breaking out pasm and other components, but it's a trade-off between future time savings and how long it would take to split up, and I don't have a good sense of how hard it is to break it out.
The current status is that the acceptance tests […]. Most of what is called Basic testing in GitHub Actions is also running, which found an error on Ubuntu 18.04 Bionic when compiling the […]. All in all, it is starting to look usable. Hopefully this time around, the curse will not strike.
I have been trying to look into if and how I can build a toolchain/compiler which I can then use in the following steps. And it seems like it is a problem in CMake, as the compiler needs to be available before the configuration step, when the […]. So, from this, I think I am going to need the compiler pre-built before calling the CMake configuration. I don't presume you have any experience with this?
Sorry, definitely not my area of expertise.
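For reference, one common pattern for this chicken-and-egg problem is to build the assembler at configure time with `execute_process()`, before `enable_language()` probes for the compiler. A rough sketch under stated assumptions - the `third_party/pasm` path, the `pasm-prebuild` directory and the `ASM_PASM` dialect name are all illustrative, not taken from this PR:

```cmake
# Sketch only: pre-build the PASM assembler during CMake configuration,
# so the compiler binary exists before enable_language() probes for it.
execute_process(
  COMMAND ${CMAKE_COMMAND}
          -S ${CMAKE_CURRENT_SOURCE_DIR}/third_party/pasm
          -B ${CMAKE_BINARY_DIR}/pasm-prebuild
  RESULT_VARIABLE _pasm_cfg)
execute_process(
  COMMAND ${CMAKE_COMMAND} --build ${CMAKE_BINARY_DIR}/pasm-prebuild
  RESULT_VARIABLE _pasm_bld)
if(_pasm_cfg OR _pasm_bld)
  message(FATAL_ERROR "Pre-building PASM failed")
endif()

# Point the custom ASM dialect at the freshly built binary
set(CMAKE_ASM_PASM_COMPILER ${CMAKE_BINARY_DIR}/pasm-prebuild/pasm)
enable_language(ASM_PASM)
```

The alternative is a full superbuild via `ExternalProject_Add()`, which keeps the main project's configure step clean but requires two CMake invocations.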
…ged modules Change the directory structure of complex-structure HAL managed modules.
Resolve problems preventing building Machinekit-HAL for different TARGET architectures via 'gcc' compiler. Use Debian Buster and up for actual testing.
CMake build-system is able to run the 'runtests' from the BUILD tree. Only the acceptance type of tests is working. All tests which require compilation (and thus probably should not be part of 'runtests' but of some other test suite) are currently not working. Also not working are the Python 'halmodule' test and the 'hostmot2' test. At the moment, 8 tests fail.
This commit repairs builds on Ubuntu 18.04 Bionic. It turns out Ubuntu 18.04 has too old a version of the 'protobuf-protoc' compiler, which had to be replaced with a back-ported one from Ubuntu 20.04 Focal. This also adds work on test running, lowering the failures to 5. (And really 2.)
Move the header files from 'HAL' library ('hal_api' INTERFACE library) into folder 'include/hal', which creates a namespace - for example 'hal.h' header should now be included as 'hal/hal.h'.
…y 'runtime' Namespace all C and C++ libraries under the 'src/libraries' mount-point. Specifically the 'runtime' one (RTAPI) using a higher level abstraction: parts that are a 'runtime', but for some reason or other are in multiple directories and namespaced 'runtime_*', are all in one 'runtime' namespace for imports.
Implement CPack based packaging for the 'libmachinekit-hal', 'libmachinekit-hal-dev' and 'modmachinekit-hal-components' on Debian systems via use of 'dh-cmake' debhelper package.
Implementation of the main set of Debian packages for executables, libraries, development files, modules and test-suites for Machinekit-HAL via the CPack. This does not implement the Python related parts.
Ubuntu 21.04 Hirsute started to return '-flto=auto -ffat-lto-objects' via 'dpkg-buildflags' in CFLAGS and CXXFLAGS and '-flto=auto' in LDFLAGS, thus activating Link-Time Optimization. That started to clash with the linker script which hides the local symbols in HAL managed modules. As a temporary solution, this commit removes the aforementioned flags from the build. A new target 'setuid' was added to allow setting the setuid bit for 'rtapi_app', 'pci_write' and 'pci_read' when run from the BUILD tree. Call with 'sudo make setuid' (when using Makefiles). Also override dh_fixperms to do the same thing when building packages.
The CMake build process of PRU related components is twofold: building the HAL managed drivers (MODULE libraries) and assembling the firmware code written in PRU Assembly language. (Represented by 'hal_pru_debug', 'hal_pru' and 'hal_pru_generic' on one side and 'pru_generic', 'pru_generic_bbai', 'pru_generic_direct_io_only', 'pru_generic_bbai_direct_io_only' and 'pru_decamux' on the other side.) This required implementing a new ASM language for the TI PRU Assembler flavour in CMake. The assembler itself (the PASM) is now distributed as a third-party dependency from Machinekit-HAL's Debian repository.
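For readers unfamiliar with CMake's ASM dialect mechanism: adding a new assembler flavour typically follows the pattern of CMake's built-in dialects (ASM_NASM and friends), where a small "determine compiler" module sets the dialect name and delegates to the generic ASM machinery. A hedged sketch of what such a module might contain - the dialect name, compiler name and file extensions are illustrative, not copied from this PR:

```cmake
# Sketch of a CMakeDetermineASM_PASMCompiler.cmake, modelled on CMake's
# built-in ASM_NASM support. Names and extensions are illustrative.
set(ASM_DIALECT "_PASM")
set(CMAKE_ASM${ASM_DIALECT}_COMPILER_LIST pasm)
set(CMAKE_ASM${ASM_DIALECT}_SOURCE_FILE_EXTENSIONS p;hp)
include(CMakeDetermineASMCompiler)
set(ASM_DIALECT)
```

Companion `CMakeASM_PASMInformation.cmake` and `CMakeTestASM_PASMCompiler.cmake` modules would follow the same delegation pattern.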
Implementation of the Python packaging workflow using the CPack function and the Python 'installer' external executable. Implementation of the BeagleBone/Texas Instruments PRU package structure.
Add packaging dependencies for a successful run of 'mk-build-deps' and 'dpkg-buildpackage' on Debian based systems (Debian Buster, Debian Bullseye, Ubuntu Bionic, Ubuntu Focal, Ubuntu Hirsute). Modify the tests - both 'runtests' and 'pytests' - to run to completion during CI testing on Debian based systems.
Install CMake config script files to a base operating system under the `/usr/lib/<gnu-triplet>/cmake/Machinekit-HAL` mount-point for the Debian packaging. Implement missing COMPONENT files for selected targets. (As CMake does not take care of it automatically.) Make sure the 'usercomp.0' runtest is successful.
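With the config scripts installed under that mount-point, a downstream project could in principle consume the package via `find_package()`. A hypothetical sketch - the exported target name `Machinekit-HAL::HAL` and the source file are assumptions for illustration, as the actual names depend on the installed config files:

```cmake
cmake_minimum_required(VERSION 3.16)
project(my_hal_consumer C)

# Locates /usr/lib/<gnu-triplet>/cmake/Machinekit-HAL/*Config.cmake
find_package(Machinekit-HAL REQUIRED)

# Hypothetical out-of-tree HAL module linking against the package
add_library(my_component MODULE my_component.c)
target_link_libraries(my_component PRIVATE Machinekit-HAL::HAL)
```

This is the usual payoff of shipping CMake package config files: out-of-tree components get include paths and link libraries from one `find_package()` call.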
Because of a runtime bug, remove the 'oldstable' Debian 10 Buster from supported platforms for Machinekit-HAL.
Modify the README for the CMake based build-system.
Script 'git_watcher.cmake' from the https://github.com/andrew-hardin/cmake-git-version-tracking project makes sure the Git SHA1 of the HEAD commit is always up-to-date in Machinekit-HAL's Runtime Config library.
Well, it's a lot uglier than it should have been and it took a lot longer than I anticipated, but that's life. (I made a promise to myself to finish it before Christmas...) @the-snowwhite, ready for merge. CC @zultron as the original author of #200 (if he wants to do some code review).
New implementation of the public DRONE CLOUD CI infrastructure for testing Machinekit-HAL's CMake based build-system on ARM based systems. Removes the 'armhf' architecture as it was virtualized and not real platform testing.
I re-implemented testing on Drone Cloud (the free tier for Open-Source Software projects) for […]. Unfortunately, Drone Cloud sucks in a way that it now only allows one job to run at any given time. Even with limited OS versions and not running parts which already run in GitHub Actions, it still takes around 5 hours. 🤷 (Maybe put all […].) It's better than nothing.
The Drone Cloud testing successfully ran to completion - after almost 24 hours, I might add. It was plagued by networking errors failing builds on git fetching, Debian package fetching and even simple […]. Given this, the fact that I still need to check the rest of the update workflow (namely the Debian repo and the change in names), that it is Christmas and people are enjoying themselves, and that nobody was that interested in it to begin with, I am going to afford myself an exception and merge this.
This is a preview pull request intended for use as a discussion piece in (not only) the Machinekit Matrix Room.