Adam Lackorzynski adam@os.inf.tu-dresden.de writes:
In a couple of sentences? Give me a try.
No, actually I thought it was perhaps too much to ask; that's what I meant. As for me, if you don't mind writing several pages, that would be great :)
When you compile a program on your desktop, you need quite a bit of support functionality, such as libraries, linker scripts, etc. Those you just install and then it works. If you want to compile a program for another OS, you need this support functionality as well. And that's what you have built.
So is building (compiling and linking + some extra glue) really that "OS-intense", even with the same compiler (and associated tools), the same language, and, not least, on the same architecture?
Also, is the "support functionality" persistent (not application/source dependent), i.e., will the process be much faster the next time I do it? (Actually I'll try this in a second, but I ask anyway.)
What is the rationale for all those trees with Makefiles on each level, which don't seem to do much (to the untrained eye, though I know the syntax) while sometimes being *immense*?
Except for the usual stuff (target, libraries, and sources), there are a couple of recurring things, like
    include $(L4DIR)/mk/prog.mk
(and more) that would perhaps make more sense (to me) if you could provide me with a "hit list" of what must be done to build an application. If I know *what* must be done, I think I could identify the *how* just by looking at other Makefiles.
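For concreteness, here is my guess at what such a per-application Makefile looks like, pieced together from other Makefiles in the tree; only the prog.mk include line is taken from above, and the variable names (PKGDIR, TARGET, SRC_C) are assumptions on my part:

```make
# Sketch of a minimal application Makefile -- structure guessed,
# not authoritative.
PKGDIR  ?= ..                 # assumed: path up to the package root
L4DIR   ?= $(PKGDIR)/../..    # assumed: path up to the L4 tree

TARGET   = hello              # assumed variable: name of the binary to build
SRC_C    = main.c             # assumed variable: C sources to compile

include $(L4DIR)/mk/prog.mk   # from the original mail: pulls in the real rules
```

If that is roughly right, then the per-directory Makefiles mostly just set variables, and the shared rules in $(L4DIR)/mk/ do the actual work -- which would explain why they look so empty while the included files are immense.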