c: *bloats memory allocation with size information, cause stupid ass programmer can't keep track of a number*

stupid ass programmer: *still has to keep track of the size to be able to use the memory in any meaningful way*
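To spell it out (a trivial, made-up sketch, nothing fancy):

```c
#include <stdlib.h>
#include <string.h>

int main(void) {
    /* I have to track the size anyway, just to use the buffer at all... */
    size_t count = 1000;
    int *values = malloc(count * sizeof *values);
    if (!values) return 1;
    memset(values, 0, count * sizeof *values);

    /* ...yet free() won't take it, so the allocator keeps its own
     * hidden copy of the size next to every allocation anyway. */
    free(values);
    return 0;
}
```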

@namark the inefficiencies in raw C are a legitimate problem, and are why, particularly when it comes to memory management, things like Java can be 40x faster or more out of the box.

Slated memory allocation in C is ultimately the way to go, but since it's not built in, a lot of programmers don't do it.

That said, there's nothing stopping someone from coding C properly and making sure they do exactly that. I just wish more coders did.
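To be concrete about what I mean (just a rough sketch with made-up names, not anyone's actual code): allocate one block up front and hand objects out of it, so malloc is hit once instead of once per object:

```c
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical fixed-size object pool: one malloc up front, then
 * allocation/free are just pointer swaps on a free list. */
typedef struct node { struct node *next; } node;

typedef struct {
    void *memory;     /* single upfront allocation */
    node *free_list;  /* intrusive list of unused slots */
} pool;

int pool_init(pool *p, size_t obj_size, size_t count) {
    if (obj_size < sizeof(node)) obj_size = sizeof(node);
    p->memory = malloc(obj_size * count);
    if (!p->memory) return -1;
    p->free_list = NULL;
    for (size_t i = 0; i < count; ++i) {
        node *n = (node *)((char *)p->memory + i * obj_size);
        n->next = p->free_list;
        p->free_list = n;
    }
    return 0;
}

void *pool_alloc(pool *p) {            /* O(1), no syscall, no size header */
    node *n = p->free_list;
    if (n) p->free_list = n->next;
    return n;                          /* NULL when the pool is exhausted */
}

void pool_free(pool *p, void *ptr) {   /* O(1), just push back on the list */
    node *n = ptr;
    n->next = p->free_list;
    p->free_list = n;
}
```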

@freemo there is no one true way to manage memory. Java may be faster at untangling the kind of spaghetti code Java fanboys write, but that's not my point. I'm talking about the inflexibility of malloc/free in the context of optimizing whatever memory management one would want to design for a particular purpose, and since it's at the interface level, if you want to do anything more clever you'll have to wave goodbye to any library code that allocates.

@namark no, we aren't talking about spaghetti code, nothing to do with that. We are literally talking about a single operation, memory allocation: Java's new vs C's malloc, and Java's new is substantially faster out of the box compared to C's malloc.

@freemo you are comparing apples to oranges; malloc would be long done by the time Java's VM boots and gets around to doing whatever new even means in that language. You only care about a single new vs a single malloc if you plan on spreading allocations all over the place without thinking of any kind of overall memory management strategy. Not my point at all.

@namark that's exactly what I said: Java slates its memory, and is thus faster than calling malloc on a per-object basis in C (which is often how it's done, despite not being the proper approach).

As I already stated earlier, to get the performance of Java's new operation one would have to pre-slate memory in C. Certainly doable, but often not done due to the extra effort, when you get that out of the box for free.

My point is, memory handling in C is more tedious to make performant for many people, often leading to the effort not being made at all.
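Roughly the kind of pre-slating I mean (again only a sketch, hypothetical names): one big block, where each allocation is little more than bumping an offset, which is more or less what a JVM's new does inside its allocation buffer, and everything is released at once:

```c
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical bump/arena allocator: one big block up front,
 * each allocation just advances an offset, everything is freed at once. */
typedef struct {
    char  *base;
    size_t used;
    size_t capacity;
} arena;

int arena_init(arena *a, size_t capacity) {
    a->base = malloc(capacity);
    a->used = 0;
    a->capacity = capacity;
    return a->base ? 0 : -1;
}

void *arena_alloc(arena *a, size_t size) {
    size = (size + 15u) & ~(size_t)15u;           /* keep 16-byte alignment */
    if (size > a->capacity - a->used) return NULL; /* out of arena space */
    void *p = a->base + a->used;
    a->used += size;                               /* the whole "allocation" */
    return p;
}

void arena_reset(arena *a)   { a->used = 0; }          /* drop everything at once */
void arena_destroy(arena *a) { free(a->base); a->base = NULL; }
```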


@freemo yes, I'm saying that the exact thing you said has nothing to do with what I originally meant.

At the same time you try to present Java's memory management as something good, while I'm dunking on it, but that's kind of tangential.

@namark I never claimed it was directly relevant to what you originally meant. Its relevance is that both are "things people complain about in C's memory management", that is all.

I am also not presenting Java's memory system as good any more than I am saying C's memory management system is good. Both are different tools for different purposes, and if you think either is the be-all and end-all of what represents a good language, then you are wrong in either case.

Java's memory management means really good performance out of the box. C's memory management is much slower out of the box, but with significantly more upfront effort its performance can match or even exceed Java's. Therefore both serve a purpose depending on your needs.

@freemo I don't think my OP is a "thing people complain about in C's memory management". I believe most everyone is perfectly happy not having to specify the size in free(), and that is philosophically in line with wanting increasingly more clever allocators and GCs.

Other than that, the only difference is that what you call "out of the box" I call "for the purposes of writing spaghetti code".

@namark Yeah, I understand your (incorrect) opinion. It is perfectly possible to write good clean code that isn't spaghetti code in Java, as it is possible in most languages. Simply saying a thing doesn't make it so. That said, it's also possible to write spaghetti code in any language, including C.

Like I said, anyone who thinks their language is the only one it's possible to write good code in, no matter the scenario, is already behind the 8 ball.

@freemo I mean spaghetti in terms of memory management specifically, not anything else. That's the whole point of garbage collected languages, the whole philosophy: you want to forget that memory management is a thing and have some magic take care of it for you. How can you both forget the problem exists and also design a good solution for your specific application?

Can you sometimes get away with forgetting memory management is a thing? Sure. Other times you end up in a predicament where you have to study the internals of the garbage collector and somehow, some way, get it to do some trivial thing you want that it just won't do properly.

So yeah, with C you won't go very far writing code like that; with Java you could, and you might get far enough for whatever purpose, but the further you go, the harder it'll be to untangle the mess.

@namark Largely it depends. In practice you will find that often, though not always, a memory managed system, when done properly, will outperform something like C out of the box (as discussed) and only be marginally slower when compared to C done "right" (as best as possible for the problem). It is often the case that the additional time it takes to code C correctly, combined with the fact that mistakes can lead to bugs not possible in a memory managed system, makes the extra effort of C an argument that simply isn't compelling.

It's also important to note that memory management in memory managed languages like Java is not a simple matter of completely ignoring the issue; you still have a great deal of control through the various reference types, namely: strong (hard), soft, weak, and phantom references. All of which can be ignored or used to whatever extent you need.

So, long story short, unless you're after that last marginal improvement in memory, or are doing something special with memory at a low level, you probably won't get anywhere near the level of improvement with C that justifies the added effort you'd need.

As I said already, it's all about the right tool for the job. Any mentality that thinks one language or approach to memory management is always superior is ultimately going to put you in a bad spot and waste many hours that need not be wasted trying to use the wrong tool for the job. C is often the right tool; it is also often the wrong tool. Situational awareness is important to being a good programmer IMO.

@freemo Somehow my experience is the opposite. At a certain complexity level you spend a lot more time optimizing with a garbage collected language, in the most backwards, indirect ways, to get your application to an acceptable level of performance, while with properly designed C you would usually only optimize to be exceptional, not acceptable. Overall, of course, having to care about memory management is harder than not having to care about it. And sure, it's not black and white, but the overall philosophy, the ultimate goals and the history can be summarized as "don't want to care about memory management", or ok, "I want to care less about memory management". The irony is that after a while you end up caring more, because marginal slowdowns eventually add up.

@namark How many years did you spend coding Java exactly? It may be that your opposite experience comes down to the fact that you never gained sufficient experience with the language. While it takes less effort than C to get right, it does take an equal amount of understanding and experience.

For example, you're talking about spending too much time optimizing in a garbage collected language... if you have the need to optimize your memory management beyond the direct and expected ways of the language, then you shouldn't be using that language; that's the point... **most** of the time you shouldn't need to optimize memory in a convoluted way, the expected ways are more than sufficient and don't create bottlenecks. When that stops being true is when you should probably be at a lower level.

The notion that you only need to optimize memory in C in exceptional situations is far from reality. As stated, if you use it out of the box, memory waste is **Extreme** in C. For example, apps like perl and ghostscript, written in C without optimizing the memory system beyond the out-of-the-box use of malloc, spend about 40% of their run time allocating and deallocating memory. The same app written in Java would have nowhere near that load, even if no memory optimization were done beyond the usual good practices.

Marginal slowdowns also don't just "add up"; the problem is not that simple. If your memory system is 0.1% slower, for example, but your bottleneck is in the processor (and that bottleneck is not due to inefficiency but to the difficulty of the task itself), then that 0.1% or even a 10% slowdown in memory will have no effect on runtime at all, since despite the memory access being slower it can complete before the processing queue is done and thus never affects runtime.

The line of thinking you're using usually leads people to what is called "premature optimization", which has the counterintuitive effect of often **slowing down** your code rather than speeding it up, by optimizing code that is not contributing to a bottleneck and thus introducing more chances for human error in your code.

@freemo I don't count years per language, and it doesn't matter if you can't follow basic reasoning. I have worked with various garbage collected languages (as most of them are these days), and with every single one, once the project reaches a certain level of maturity, memory-related performance becomes a big issue that a lot of time is spent on, to the point where the pinnacle of studying the language becomes studying the GC.

>out of the box memory waste is **Extreme** in C
yes, spaghetti code memory waste is extreme; that's why you never get far with it, and that's why, if you did get far, you have a good design that fits your application and you don't have to worry about optimizing just to be at an acceptable level of performance.

> apps like perl and ghostscript
wow, what an example: a garbage collected language can't do well without the memory management of a garbage collected language, unbelievable.

>The line of thinking you're using usually leads people to what is called "premature optimization"
Exact opposite: good design is not premature optimization, and if you are using a language that is in general well optimized, you don't need to worry about micro-optimizations. Premature (or otherwise) micro-optimizations are much more prevalent in less optimal languages, where they actually can have an impact.

I think we went over this before, but which project was it that Aparapi competes with, objectively? I guess no such thing... or if you allow yourself to drop the Java/JVM fanboyism, you might actually say LLVM. Is Aparapi in any way better than LLVM? Guess not, as it uses LLVM, doesn't it? Oh look, LLVM itself actually uses a decent language, who would have thought? I wonder if they will ever have to rewrite a significant portion of the code in Java...

@namark If you're reaching the point in a project where memory issues wind up costing a significant investment of time late in the project, then the project either picked the wrong language or simply doesn't have good programmers.

My job is, more often than not, to get called in to write or fix software that the current, less experienced team is not capable of making performant on their own. Sometimes that means I need to rewrite it in C, but this is very rare. More often than not the problems arise from bad design choices made at the start of the project and are easily fixed with the right experience, without the investment of time you suggest.

That said, my own projects are ones that rely on being extremely efficient as a purpose of their design (for example, Aparapi's whole point is GPU-accelerating Java code, so efficiency is paramount). In all these cases performance was done right from the get-go and the problems you describe never arose. Yes, I would profile and sometimes need to optimize, but this was always a relatively easy and short process.

So my assumption is that either your own experience was simply lacking in the languages you had this problem in and you made poor design choices, or your team was making poor choices. Not the language itself.

As an example, a national trucking company with tens of thousands of trucks on the road across the USA called me in to optimize their system. It was mostly graph optimization problems (think traveling-salesman sort of stuff), so extremely sensitive to memory management and other efficiency issues. Their current software, written in C, was massive and ran across an entire server room of about 50 computers on racks. I was able to improve the rate at which it computed solutions tenfold and reduce the number of computers from 50 down to 1 (though there were 1 or 2 extra for redundancy, but these were idle for the most part). Best part yet, it was written in a combination of Ruby and Java in the end, both memory managed languages.

I could easily list the times that teams of programmers, over years, were incapable of making a system efficient in C, which I was then effortlessly able to replace with memory managed languages that did the same task many orders of magnitude faster, all in a fraction of the development time that the C programmers invested.

That said, I can also count the number of times the reverse is true: that I used C to replace a memory managed language and was able to achieve an improvement in performance as well.

@freemo Nice brag; keep trying to overpower my arguments with the sheer power of your authority, it's super effective. I shared my experience with you in hopes it rings a bell, not to prove a point, and I presented my argument about the specifics of the languages alongside it, in case you didn't notice. Some random proprietary crap or a niche open project being total failures does not say anything about the languages used. If your goal here is to somehow present empirical evidence, find and compare large and successful open projects that directly or indirectly compete.

@namark ::rolls eyes:: it's not about bragging. I shared my experience for the same reason as you: because it contradicts your claims, much as you feel your experience contradicts mine. My goal in mentioning this is no more empirical than your own. Simply put, I've seen for myself that your claims have been wrong in countless projects; that experience has value, experience is hard to objectify, so take it as you will.

Either way, I see you've reached that point, as you always do in conversations, where your own obnoxiousness outweighs any ability to have an adult conversation. I'll be muting this conversation now; have a good day.

@freemo I didn't share experience to contradict. It was contradictory, but insofar as I shared it, I was hoping to find something in common. Like the study of the GC, technically an implementation detail, becoming the pinnacle (or at least an important milestone) of the practical study of the language, which is rather ironic and goes against the whole premise. In your last brag there was simply nothing for me to latch onto, even; literally no argument or reasoning, you were just asserting. But I guess that's what one does when we "reach that point in conversation".

@namark My assertion was about both contradicting and agreeing, in a sense. On the one hand I pointed out that with some classes of problems what you described is true (and thus in line with your experiences). Yet with other classes of problems you are incorrect. This was my argument from the get-go.

So my point was always that your stance, as an absolute, is incorrect, but can be true situationally.

If we are really going to be critical of the meta-argument, where you seem to be talking now, I'd say my stance has more common ground to connect to (not being an absolute and all) than your own. That said, I personally don't find such meta-discussion useful beyond reflecting on how we can communicate better in the future.

@namark A good example of this at play is actually the Aparapi app I wrote. It started as relatively pure Java, then the parts that made more sense as C were replaced with JNI (basically C code integrated into Java). In the end the app uses C where memory management or low-level control has an advantage, and Java where that serves no advantage.

Since it was written well there was no issue of "untangling"; the boundaries were always clearly defined. Doing the whole thing in C from the get-go would have been the wrong move, just as doing it all in Java was the wrong move...
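For the curious, the C side of such a boundary looks roughly like this (hypothetical names, not the actual Aparapi code): a method declared native in Java, implemented in C where the low-level control pays off:

```c
#include <jni.h>

/* Hypothetical JNI entry point: implements a Java method declared as
 *   package demo; class Kernels { static native long sumSquares(int[] data); }
 * The heavy lifting happens in C, the rest of the app stays in Java. */
JNIEXPORT jlong JNICALL
Java_demo_Kernels_sumSquares(JNIEnv *env, jclass cls, jintArray data)
{
    (void)cls;
    jsize len = (*env)->GetArrayLength(env, data);
    jint *elems = (*env)->GetIntArrayElements(env, data, NULL);
    if (!elems) return 0;              /* OutOfMemoryError is already pending */

    jlong sum = 0;
    for (jsize i = 0; i < len; ++i)
        sum += (jlong)elems[i] * elems[i];

    /* JNI_ABORT: don't copy anything back, the array was only read */
    (*env)->ReleaseIntArrayElements(env, data, elems, JNI_ABORT);
    return sum;
}
```

The Java side just declares the native method and loads the library; everything else stays plain Java, so the boundary stays narrow and obvious.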

right tool for the job.

@freemo more like wrong tools for the wrong job if you ask me
