
@gassahara@mstdn.io @amiloradovsky@functional.cafe Look at em trying to dodge bugs!
gcc.gnu.org/bugzilla/show_bug.
Good thing there's always the friendly neighborhood standard expert to back you up.

And again you disregard us poor humans, while I'm trying to convince you there is nothing else. There is only us. Abstractions are not just nice for us, they are our only means of doing anything, and we are doing everything there is in this industry. That's why good tools for writing good abstractions are important. For example, I often see indirection or runtime type information used in C due to the lack of a proper type system or generics. Do you want userdata without indirection? Do you want callbacks without indirection? The C++ type system and generics have you covered. And there is much more in the same vein. With C there is a general tendency to try to break away from the limitations of the language by pushing abstraction out of it, either to the runtime or to external tools (like code generators). You shouldn't be generating code; compilers generate code. Become a compiler developer instead, add your language extension to gcc (or llvm, if you are a pussy), then try to standardize it and get roasted. Do good for humanity (likely not approved by your current employer), not for your job security by writing code no one else can read. Don't do it out of altruism but out of professional pride, which will be repeatedly beaten out of you by the standards committee.
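
To illustrate the userdata/callbacks point with a hedged sketch (the function names are mine, not from the post): the C idiom below needs a function pointer plus an untyped void* userdata, i.e. runtime indirection, while the template version lets the compiler see the concrete callback type and inline it.

```cpp
#include <cstdio>

// C-style: indirection through a function pointer and an untyped userdata.
void for_each_c(const int* xs, int n, void (*cb)(int, void*), void* userdata) {
    for (int i = 0; i < n; ++i) cb(xs[i], userdata);
}

// C++-style: the callback's concrete type is a template parameter,
// so no function pointer and no void* are needed.
template <typename F>
void for_each_cpp(const int* xs, int n, F&& cb) {
    for (int i = 0; i < n; ++i) cb(xs[i]);
}

int main() {
    int xs[] = {1, 2, 3};
    int sum = 0;
    for_each_cpp(xs, 3, [&](int x) { sum += x; });  // typically inlined
    std::printf("%d\n", sum);
}
```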

Sure, be aware of your system, provide a proper initial guess and a number of iterations for the Babylonian method that make sense for your system or use case, but do not implement the basic algorithm over and over again, getting it wrong half the time. I would even say write a compile-time function to compute the initial guess, instead of doing it on paper and leaving a mysterious magic number behind. If the compiler does not optimize your obvious, expressive abstraction, become a compiler developer again, or nag them until they fix it.
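
A minimal sketch of that compile-time initial guess, assuming an unsigned 32-bit input (all names here are illustrative): the guess is 2^ceil(bit_width/2), which always overestimates the square root, so the iteration converges with no magic constants left behind.

```cpp
#include <cstdint>

// Picks 2^ceil(bit_width(x)/2) as the starting point; assumes x > 0.
constexpr std::uint32_t initial_guess(std::uint32_t x) {
    std::uint32_t bits = 0;
    for (std::uint32_t t = x; t != 0; t >>= 1) ++bits;  // bit width of x
    return std::uint32_t{1} << ((bits + 1) / 2);        // always >= sqrt(x)
}

constexpr std::uint32_t babylonian_sqrt(std::uint32_t x,
                                        std::uint32_t guess,
                                        int iterations) {
    for (int i = 0; i < iterations; ++i)
        guess = (guess + x / guess) / 2;
    return guess;
}

// The guess is computed, not hand-derived, and it happens at compile time.
static_assert(babylonian_sqrt(1u << 20, initial_guess(1u << 20), 8) == 1024);
```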

For every barely exploitable Spectre, there are probably a dozen embarrassingly exploitable software vulnerabilities. Not to mention that it is our culture of writing hardware-specific code that drove hardware engineers to the overcomplicated designs that cause these problems, forcing them to optimize old instruction sets instead of developing new ones. It's not our fault for doing that initially; it is our fault for sticking with it to this day, and for planning to stick with it in the future.

Yet even in our current climate, the fact that compilers (written by humans) can beat people at optimizing certain common abstractions is a testament to the effectiveness of the approach. They are even adding GPU backends to them. I for one am waiting for a new wave of Verilog backends.

@gassahara@mstdn.io Compiler writers are also human! The intention of not only the standard but of programming languages (and mathematics) is primarily human-to-human communication. You need to be able to read what I wrote, otherwise it's useless. Yes, it's not teaching material, it's a reference, but the agreement on it is where portability fundamentally comes from. It only seems inhuman because it's a very big relationship.

Stroustrup's friend Stepanov also convinced me that C++-style generics are the way to go for programming in general (hehe, "general").

@amiloradovsky@functional.cafe Yes, reality is imperfect, and we must yell at it, so that others hear us and are convinced. That is all that matters.

Coq still seems way beyond my comprehension; too many steps whose purpose is unclear to me. When it comes to formal systems, I think mathematicians have a lot more work to do there first, maybe using Coq.

@amiloradovsky@functional.cafe @gassahara@mstdn.io
"The mpn functions are designed to be as fast as possible not to provide a coherent calling interface", Doesn't sound very promising, though I wouldn't be surprised if GMP has very similar abstractions to what I'm talking about here. I'm saying a better version of those lowest level abstractions should be standard. I would even strive to make the operators work with that.

Coq I'm not familiar with, but it didn't seem very generic to me from your posts; it sounded like a very specialized environment, where you can allow yourself to (or even have to) ignore a whole lot of reality. I don't want to make my own perfect world, I want to code to a standard. Rather than reinventing all the wheels, my time would be better spent writing one of those completely ignored proposals for the C++ standard, if I ever get around to it.

@gassahara@mstdn.io you completely misinterpret my stance, it seems. Minimal efficient abstraction is my goal as well, throughout mathematics, hardware, and software; I just don't draw any arbitrary lines. Behind the hardware design that you try to optimize for today is a programmer like you. Tomorrow you should both be engineers, working together and not playing blind catch-up with each other.

@gassahara@mstdn.io @amiloradovsky@functional.cafe not the nail polish guy!

@gassahara@mstdn.io @amiloradovsky@functional.cafe are we listing YouTube lectures now?
Efficient Programming with Components - Alexander Stepanov
Programming Conversations - Alexander Stepanov
everything else you can find (and let me know) by Alexander Stepanov

@gassahara@mstdn.io @amiloradovsky@functional.cafe sqrt is another example of a horrible interface. There is no such thing as square root: there is no known arithmetic that can compute it, and no known algebra that works with it. There is, for example, the Babylonian method, which naturally requires an initial guess and a terminating condition (a precision check or a number of iterations). That interface cleanly accommodates the sqrt trick and many other so-called optimizations of square root. I don't want to stop you from tweaking the parameters; I want your tweaking to be legible and self-documenting.
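
To make that concrete, here is a hedged sketch of such an interface (names and the particular policies are my own): the Babylonian iteration is fixed, while the initial guess and the stopping rule are explicit parameters, so every tweak is visible at the call site.

```cpp
#include <cmath>

// The algorithm is fixed; the guess and the stopping rule are named parameters.
template <typename T, typename GuessFn, typename DoneFn>
T babylonian_sqrt(T x, GuessFn initial_guess, DoneFn done) {
    T g = initial_guess(x);
    while (!done(g, x))
        g = (g + x / g) / T(2);
    return g;
}

int main() {
    // One caller tweaks for precision...
    double a = babylonian_sqrt(2.0,
        [](double x) { return x; },                                       // guess
        [](double g, double x) { return std::fabs(g * g - x) < 1e-12; }); // stop

    // ...another trades accuracy for a fixed iteration budget.
    int iters = 0;
    double b = babylonian_sqrt(2.0,
        [](double x) { return x / 2; },
        [&](double, double) { return ++iters > 4; });
    (void)a; (void)b;
}
```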

I don't think integer overflow is a niche problem; it comes up quite often, in various contexts, in my experience. I went on about the multiplication interface because it applies to addition as well, and I think it is a good basis for a generic arithmetic interface (or the only acceptable basis).
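
As a purely illustrative sketch of what such a basis could look like (not a real proposal, and all names are mine), a C++20 concept that demands the full result of both operations from any digit type:

```cpp
#include <concepts>

template <typename D>
struct add_result { D sum; D carry; };  // the carry could also be bool, or a zero type

template <typename D>
struct mul_result { D low; D high; };

// A digit type must hand back the whole result: sum + carry, low + high.
template <typename D>
concept Digit = requires(D a, D b) {
    { full_add(a, b) } -> std::same_as<add_result<D>>;
    { full_mul(a, b) } -> std::same_as<mul_result<D>>;
};

static_assert(!Digit<int>);  // the built-in types do not model this today
```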

Generic code is and will be more efficient than specific code. As systems get more complex and the line between hardware and software blurs (especially if we finally get free markets in the production of these), most optimization will be done by compilers, as a simple matter of separation of concerns. It is also our only hope of avoiding degenerate hardware/software that is optimized to run old hardware-specific code. Generic code must be as expressive as possible, rejecting abstractions that do not properly capture the essence of the algorithms; otherwise it's not generic, it's a mistake.

I wish I could write this niche library of arithmetic that supports the fundamental types, but I can't without breaking the C/C++ standard.

@amiloradovsky@functional.cafe @gassahara@mstdn.io But I don't want hardware-dependent pieces in my code ToT I want them in the compiler. The language must be expressive enough to make the job of the compiler writers easier, and also to fit similar use cases for types that have nothing to do with the hardware. If you want it to work with the native ALU word, or the smallest SIMD/GPU size that can fit the input for best performance, or a special type representing one ternary bit to be compiled into a circuit diagram, or something that's not even a number in the conventional sense, you customize that through the type system, while the algorithm and the interface remain generic, because it's arithmetic; I didn't make it up, ancient mathematicians did, and it survived to this day. As it stands, the C-style arithmetic types are edge cases that I have to work around, while their essence can easily be captured by an interface that returns two digits/numbers. A type that only needs the lower part and just doesn't care if it overflows can always set the higher part to zero (or to a special zero type, if you have a type system), and the compiler can see that and optimize. An infinite-precision type, or more precisely an "assumed not to overflow" type (I'm looking at you, int!), can do the same.
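
A small sketch of the "special zero type" idea, with illustrative names: a wrapping multiply returns zero_t as its high part, so anything that consumes the high part is statically known to be zero and can be dropped, while a full multiply returns a real high digit (implemented via the wider built-in here purely for brevity).

```cpp
#include <cstdint>

struct zero_t {};  // a high part that is statically known to be zero

template <typename Low, typename High>
struct wide { Low low; High high; };

// "Don't care about overflow": the high part is zero_t by construction,
// so the compiler can see it and optimize away any code that consumes it.
constexpr wide<std::uint32_t, zero_t> mul_wrapping(std::uint32_t a, std::uint32_t b) {
    return { a * b, {} };
}

// Full multiply: the high part is a real digit.
constexpr wide<std::uint32_t, std::uint32_t> mul_full(std::uint32_t a, std::uint32_t b) {
    const std::uint64_t p = std::uint64_t{a} * b;
    return { static_cast<std::uint32_t>(p), static_cast<std::uint32_t>(p >> 32) };
}

static_assert(mul_full(0xFFFF'FFFFu, 2u).high == 1u);
```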

@amiloradovsky@functional.cafe @gassahara@mstdn.io when I was writing generic code, it wasn't any number of bits, it was digit A and digit B, and I needed to add them (not advanced enough to multiply yet, but some day), wishing it to be compiled, optimized, and run on all imaginable and unimaginable architectures/platforms (including those that are used to design hardware) for all possible and impossible purposes. This is my fetish, yes.

@amiloradovsky@functional.cafe @gassahara@mstdn.io I don't know much about hardware; if word halving is more efficient than, say, a separate "now give me the higher word" instruction, or if some other hardware subsystem has to do it, then sure, I'll let the compiler handle it. I was just yelling at C, and at every other language that inherited (or originated) that arithmetic interface. I want to write generic code that works on all arithmetic types, not all but the largest, and even works on weird user-defined types if they implement a basic arithmetic interface, without requiring an elusive wider type, which at worst needs arithmetic of its own, and at best a weird bit-shift interface. I think there is a whole world of arithmetic between modular and infinite precision, not even necessarily on conventional numbers.

If I go with word halving in generic code, in addition to making the compiler's job way harder, what do I do when the user-defined type is a single indivisible digit? I cry, because it's a backwards hack, not a proper interface for arithmetic.

So, at the end of the day, I have to handle so many edge cases that I never got around to doing it. The only language I know that has something close to what I want is Swift, and I hate that so very much...

@amiloradovsky@functional.cafe @gassahara@mstdn.io I'm not talking about bignum algebra, I'm talking about arithmetic. Multiplication of two digits should return two digits; it is simply natural for multiplication. The two digits returned do not then magically multiply into four; that's out of scope. If you want to discard the higher digit for your specific use case, or raise an error when the higher digit is set, you are free to do so. Someone else might want to discard the lower digit, or implement further multiplication, or specific error handling based on the amount of overflow.
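
A hedged sketch of that interface for 32-bit digits, deliberately written without assuming any wider type exists (names are mine): each digit is split into 16-bit halves and the partial products are recombined, with the carries landing in the high digit.

```cpp
#include <cstdint>

struct two_digits { std::uint32_t low; std::uint32_t high; };

constexpr two_digits mul_digits(std::uint32_t a, std::uint32_t b) {
    const std::uint32_t a_lo = a & 0xFFFFu, a_hi = a >> 16;
    const std::uint32_t b_lo = b & 0xFFFFu, b_hi = b >> 16;

    const std::uint32_t ll = a_lo * b_lo;  // bits  0..31
    const std::uint32_t lh = a_lo * b_hi;  // bits 16..47
    const std::uint32_t hl = a_hi * b_lo;  // bits 16..47
    const std::uint32_t hh = a_hi * b_hi;  // bits 32..63

    // middle column, including its carry into the high digit
    const std::uint32_t mid = (ll >> 16) + (lh & 0xFFFFu) + (hl & 0xFFFFu);

    const std::uint32_t low  = (mid << 16) | (ll & 0xFFFFu);
    const std::uint32_t high = hh + (lh >> 16) + (hl >> 16) + (mid >> 16);
    return { low, high };
}

static_assert(mul_digits(0xFFFF'FFFFu, 0xFFFF'FFFFu).high == 0xFFFF'FFFEu);
static_assert(mul_digits(0xFFFF'FFFFu, 0xFFFF'FFFFu).low  == 0x0000'0001u);
```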

The problem with casting to a (virtual) wider type is that it implies the existence of a wider type on which you can do further arithmetic, but for which no wider type of its own exists. It's an edge case, not a generic interface for arithmetic.

@amiloradovsky@functional.cafe @gassahara@mstdn.io multiplication has a similar problem, one that C just completely ignored, and there are even fewer excuses one can make there. Be it hardware or software, if you implement multiplication for x you need to give me the result in 2x; otherwise don't implement multiplication, and give me efficient primitives to do it myself. I think it makes sense to generalize this to addition as well, with the high number being a boolean (separate type or not). I don't like the approach of having a biggest type, because then what do I do when multiplying two of those? I don't have a generic interface to work with. Numbers that are strictly allowed or disallowed to overflow are useful as well, but at the foundation we should have a more expressive interface.
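
And the corresponding sketch for addition, where the "high digit" is just a carry bit (again, illustrative names only):

```cpp
#include <cstdint>

struct add_digits_result { std::uint32_t sum; bool carry; };

constexpr add_digits_result add_digits(std::uint32_t a, std::uint32_t b,
                                       bool carry_in = false) {
    const std::uint32_t s = a + b + (carry_in ? 1u : 0u);  // wraps, by definition
    // overflow happened iff the wrapped sum dropped below an operand
    const bool carry_out = s < a || (carry_in && s == a);
    return { s, carry_out };
}

static_assert(add_digits(0xFFFF'FFFFu, 1u).carry == true);
static_assert(add_digits(0xFFFF'FFFFu, 1u).sum == 0u);
static_assert(add_digits(0xFFFF'FFFFu, 0xFFFF'FFFFu, true).sum == 0xFFFF'FFFFu);
```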

@georgia you write a program as if it were one program, and the webserver takes care of running many instances of that program in parallel for many users (called clients) connected over HTTP. It's often used in conjunction with a database management system that solves the sharing problems, to kinda sorta maintain the illusion of one program, as long as you adhere to certain rules of thumb.
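
A minimal sketch of that model, assuming the classic CGI-style setup where the webserver launches one instance of the program per request; any shared state would live in the database, not in the program itself.

```cpp
#include <cstdio>
#include <cstdlib>

int main() {
    // The webserver sets environment variables describing the request.
    const char* query = std::getenv("QUERY_STRING");

    // Write the HTTP response to stdout: headers, a blank line, then the body.
    std::printf("Content-Type: text/plain\r\n\r\n");
    std::printf("Hello, client! You asked for: %s\n", query ? query : "(nothing)");
    return 0;
}
```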

@Meeper

re: huge essay incoming 

@pernia @orekix yeah, that's a good indicator; if you're also decent at algebra you should be all set. @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia I would say it's natural, by my understanding of the word.

I work as a software developer -_- but I'm an aspiring programmer ^_^ I post code, very proud.

@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia just trying to point out the similarity with the concept of god, with some additional flair and brave overextension. I guess I have a thing for identifying similar definitions and optimizing the usage of words. I would say it's part of the profession, but I was always annoying like that, so it's probably the other way around.
@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia So the use of the word is not an expression of particular insight, but an assertion of your beliefs? You mentioned the natural laws before. The laws of everything, you assert. The truth you speak, of the one and only god, whose prophet you are. This is a very common mindset indeed.

@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia I thought I covered that case as well. If absolutely everything is natural, then the word is meaningless. Any statement you make with that word will have no meaning, including all the statements you made prior. It becomes the equivalent of saying whether something exists or doesn't exist.

I'm not intentionally misinterpreting, I'm trying to fish out an understanding. Your initial claim was that the only thing unnatural is technology. The anthill, however, made you veer towards the meaninglessness above, so I decided to push you with the bones and see what that would do. Your response was as if there were a fundamental difference between the bones and the anthill, despite your current stance that it is all the same nature. The question wasn't whether they were identical in all respects, but only in the context of distinguishing natural from unnatural, and in that context you considered it important to outline this seemingly important difference: the consciousness, and its connection to both the anthill and the technology. Finally, this last one being unnatural/artificial in your initial stance is the only meaning you gave to the word nature in this discussion, since the alternative conclusion, that everything is natural, as I outlined above, renders it meaningless. Thus I try to convince you that perhaps intuitively you agree with me on what is unnatural, and that humanity is, well, most unnatural, if you will.
@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia sure, I'll go stand in the corner and think about what I just said. The blasphemy. And perhaps you'll realize that you confirmed what I claimed, by calling consciousness unnatural (or the source of the unnatural, which is the same thing in this context).
@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets

re: huge essay incoming 

@pernia didn't your cells create your bones just as intentionally as the ants created the anthill?
This line is not drawn anywhere outside of your mind. Thoughts are the only things you can call unnatural, and that is where a meaningful definition begins. That is why saying that humans are not natural, or that they defy nature, is the only meaningful thing to say. Otherwise everything is natural and the word is meaningless.
@orekix @sathariel @waltercool @lain_os @miserablepileofsecrets
