this post was submitted on 21 Oct 2023
Programming Languages
You will need a hardware description language (HDL). Verilog and VHDL are two very established ones, but they are tricky. A newcomer is Bluespec, which is now open source, so if you wanna go down that rabbit hole, I'd recommend that one.
Sorry, I meant a general language rather than one that is described as a "hardware description language". I went ahead and edited the post to be more clear about that.
Thanks for the info though as others might still be interested in it!
The lowest-level languages, such as C, compile down to CPU instructions, which are still way above logic gates. The CPU in turn reads the instructions and drives the computer's components to, in a sense, "simulate" what could be described as a boolean expression -- at every CPU clock cycle. The next cycle, the permutation of all the control signals and computer components will be different. I highly doubt any programming language implementation has an IR that resembles what you are looking for, including Mathematica. The closest you get is probably HDLs, but then you need to do all the mathing yourself.
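That per-cycle picture can be toyed with in a few lines: model the next state of some signals as a boolean function of the current ones and apply it once per "clock". This is a made-up toy (a 2-bit counter), not a model of any real CPU:

```python
def step(state):
    # Toy next-state logic built purely from boolean expressions:
    # next_a = a XOR b, next_b = NOT b (a 2-bit binary counter).
    a, b = state
    return (a != b, not b)

# Each "clock cycle" yields a different permutation of the signals.
state = (False, False)
history = [state]
for _ in range(4):
    state = step(state)
    history.append(state)
```

The same two boolean expressions get re-evaluated every cycle; only the stored state changes, which is the sense in which the hardware "simulates" a boolean expression per tick.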
And with so much stuff being built on top of C (or at minimum LLVM), I was afraid that would be the case.
I was kinda hoping there would be some hacky compiler that could take a C function with boolean inputs a and b, and a boolean output for c.
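Something in that spirit can be sketched by brute force: enumerate a boolean function's truth table and read off its minterms. This is a hypothetical Python stand-in (the names f, a, b are made up, and XOR is just a placeholder body), not the original C snippet:

```python
from itertools import product

def f(a, b):
    # Hypothetical stand-in for the kind of C function described above:
    # two boolean inputs, one boolean output (here: XOR).
    return (a and not b) or (not a and b)

# "Compile" f down to a truth table by exhaustive enumeration.
table = {(a, b): f(a, b) for a, b in product([False, True], repeat=2)}

# Minterms: the input combinations where the output is True.
minterms = [inputs for inputs, out in table.items() if out]
```

From the minterms one could emit a sum-of-products expression, which is exactly the gate-level view; real logic synthesizers then go on to minimize it.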
I can highly recommend you have a look at some HDLs; e.g. Verilog can look roughly like your example and synthesizes down to logic elements.
Cool, I'll give them a shot! I'll admit I haven't heard great things about typical HDLs, so I haven't looked into them much.
Another way could be to run that through a compiler with optimizations enabled, and then decompile the resulting binary back to code. But if you want to optimize hot code, mathematical reduction is seldom where the problem lies.
I don't know about math reduction not being the bottleneck. If I were custom-optimizing hot code then yeah, cache hit optimization is huge, but I'm thinking of generic optimizations on hot code that only the compiler looks at: algebraic shuffling beyond out-of-order execution and SIMD. For example, I want to be confident that the compiler would transform something like
for i in range(x): total += i + 1
into
total = x * (x + 1) // 2
And based on stuff like this (which is shockingly recent IMO) I don't think modern compilers can even find that shortcut right now. Which is kinda sad when you think about it.
If x = 65536, any non-algebraic optimization would be vastly inferior. And sure, an experienced dev wouldn't make this kind of mistake, but I bet half the code running on the average computer wasn't written by experienced devs. And it's not like it's an either-or situation, we can do both optimization steps.
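As a quick sanity check on this kind of algebraic reduction, here is a small comparison of a summation loop against its closed form. This is a generic illustration (assuming the classic sum-of-1..x case), and the function names are made up:

```python
def sum_loop(x):
    # The naive O(x) loop a compiler might be handed.
    total = 0
    for i in range(x):
        total += i + 1
    return total

def sum_closed(x):
    # The O(1) closed form an algebraic optimizer could emit instead.
    return x * (x + 1) // 2
```

For x = 65536 the loop does tens of thousands of additions while the closed form is one multiply, one add, and one shift, which is why no amount of cache-friendly tuning of the loop can compete with the algebraic rewrite.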
Another newcomer is Amaranth HDL, which might be more approachable; it is embedded in Python and compiles down to Verilog.