Been doing a lot of coding in Python lately. While some of the hackability of the language is nice for doing some cool things, overall, I can't say I'm a fan. It's a little too opaque for writing algorithms where efficiency is important. It's hard to tell what data types are truly backing various variables, which makes it tricky to pick the right implementation. I think I'm working with raw arrays and they turn out to be linked lists or worse. Even then, something like a "dict" isn't always clearly a tree implementation, a hashmap implementation, or something else entirely.
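For the curious, here's the kind of poking around I mean. A minimal sketch, assuming CPython: a dict turns out to be a hash table rather than a tree, which you can see from what it demands of its keys (hashing, not ordering).

```python
# Minimal sketch, assuming CPython: dict is a hash table, not a tree.
# A tree-based map would need keys that support ordering; a hash map
# only needs __hash__ and __eq__. Plain objects hash but don't order.
class Token:
    pass

a, b = Token(), Token()
d = {a: 1, b: 2}           # works: default __hash__ is identity-based
print(d[a], d[b])          # -> 1 2

try:
    a < b                  # no ordering defined on Token
except TypeError as err:
    print("keys need not be orderable:", err)

try:
    {[1, 2]: "x"}          # lists are unhashable
except TypeError as err:
    print("keys must be hashable:", err)
```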

Granted, this isn't an impossible task; I have managed to figure it out by pulling open the source code of the libraries I call and by using profiling tools. But Python seems to either not care about any of that or to actively obscure it.
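As an example of the profiling route, a rough sketch, again assuming CPython: timing front-insertion is what tells you a list is a flat array of pointers (O(n) insert at the head) rather than a linked list, and that collections.deque is the structure actually built for that job.

```python
# Rough sketch, assuming CPython: timing exposes the backing structure.
# list is a contiguous array, so insert(0, x) shifts every element;
# collections.deque gives O(1) appends at either end.
import timeit

t_list = timeit.timeit(
    "xs.insert(0, None)",
    setup="xs = list(range(100_000))",
    number=1_000,
)
t_deque = timeit.timeit(
    "xs.appendleft(None)",
    setup="from collections import deque; xs = deque(range(100_000))",
    number=1_000,
)
print(f"list.insert(0, x):   {t_list:.4f} s")
print(f"deque.appendleft(x): {t_deque:.4f} s")
```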

@freemo I think that python is optimised for reducing *development* time. It's good for when you need to do something quick-and-dirty, the sort of thing where you need to only ever run it once or twice and you're really not going to be stretching the processing or memory bounds of the computer. The sort of thing where it's important that the code be readable, because there's a good chance that if you ever want to run it again, you'll need to tweak it a bit, first.

And in that narrow domain, it's really very good. But outside of that domain, it's probably not the best choice...

@ccc

While I might agree that it is geared towards shorter development time, it seems odd to me that it finds so much use in scientific circles, considering it's not well suited for that.

Also, I think readability can be interpreted in different ways here. I'd say that, due to the ability to hack code, it really isn't very readable in a meaningful way. You can look at code and have no idea what it is **really** doing, and it can be rather difficult to find out. That's the opposite of readability in my mind.
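To make that concrete, a small sketch of two perfectly legal tricks that make a line of Python mean something other than what it appears to say:

```python
# Sketch: legal Python that defeats reading the code at face value.
class Config:
    def __getattr__(self, name):
        # Could hit the network, read files, mutate global state...
        print(f"(secretly computing {name!r})")
        return 42

cfg = Config()
timeout = cfg.timeout      # looks like a plain field read; it isn't

# Monkey-patching is the same story: even builtins can be swapped out
# at runtime by any code that ran before yours.
import builtins
real_len = builtins.len
builtins.len = lambda obj: real_len(obj) + 1   # some library did this
print(len([1, 2, 3]))      # -> 4
builtins.len = real_len    # undo the damage
```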

The only way I sort of see the readability argument is that, with the tabs and the syntax, it might be easier to scan lines of code, but that's a small part of readability in my eyes.

But I do agree that it does a fine job of minimizing single-developer start time on small projects where performance doesn't matter.

@freemo Python is used in a lot of places that it is not really all that well suited for. This seems to be largely because someone discovers how great it is with short, quickly coded demonstration problems - which is where it is at its best - and then starts trying to use it for everything.

Now, it can *do* just about everything; and having a shorter development time is a solid advantage in a large slice of 'everything'. It's just that, once you get away from simple demonstration problems and into large, serious projects, then there are usually other languages that work out better; but even there, there's a very large category of problems where the difference might be fairly marginal.

So it gets used in a lot of places where it's not the best choice - because in some of those places, it's still a *decent* choice, just not the *best* choice.

@ccc @freemo

I mean every Turing-complete language can “do just about everything”…

I got turned off Python a while ago. It was the lack of static typing, the overcomplicated, bizarrely behaved parallel processing, and the fact that if my text editor is incapable of understanding a program's structure, then how the hell am I supposed to understand it?
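The typing complaint in a nutshell, as a minimal sketch: annotations are ignored at runtime, so a swapped-argument bug sails straight through unless you run an external checker such as mypy.

```python
# Minimal sketch: type hints are not enforced by the interpreter.
def repeat(word: str, times: int) -> str:
    return word * times

print(repeat("ab", 3))   # "ababab", as intended
print(repeat(3, "ab"))   # also "ababab": arguments swapped, no error
# A static checker (e.g. `mypy thisfile.py`) would flag the second call.
```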

Oh and the fact that it’s slow.

Every time I wrote a Python extension to go with my old awful Python code and it segfaulted from a double free again, with no way to tell how or why it happened, I started asking myself, “Why don’t I just write EVERYTHING in C?” At least C has a debugger that isn’t pants-on-head broken.
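For what it’s worth, the closest I got to answers on the Python side was the stdlib faulthandler module, which at least dumps a Python-level traceback when the process dies on a fatal signal. A minimal sketch (the crash here is deliberate):

```python
# Minimal sketch: faulthandler prints the Python traceback on a crash,
# so you can at least see which call crossed into C before the segfault.
import ctypes
import faulthandler

faulthandler.enable()
ctypes.string_at(0)   # deliberate NULL dereference -> segfault + traceback
```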

@cy

While I do agree with your sentiments on Python, I also wouldn't go so far as writing everything in C. C is not well suited for many things and certainly isn't a good drop-in replacement for high-level scripting languages. But otherwise your point-by-point on Python seems accurate to me.

@ccc

@freemo @ccc C is a little verbose, but if you just go with certain standard extensions (POSIX, basically), you can do a lot of “high level” stuff with it pretty cleanly. It’s when you try to be “cross-platform” that C starts getting pretty nightmarish, mostly thanks to Microsoft and Apple.

Or rather, I can do a lot of high-level stuff. It’s a good language if you are the sort of person who forgets what module you imported the “foo” function from like 2 minutes ago. It’s not a good language if you don’t want to sit there typing “very_important_module_foo()” every time. You definitely shouldn’t use it if a different language works better for you.


@cy

I am using high-level here to include certain ideas such as:

* platform independence

* minimal resource management (memory management)

* implied performance improvements (memory slating, caching of certain operations that are repeated, etc.)

* automatic handling of by-reference and by-value in a safe way obscured from the coder

and many other things besides. While you can wrangle C into doing any of those, as well as many of the things I didn't mention, none of it is intrinsic to the language. (A quick sketch of that last bullet, the reference/value handling, follows below.)
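That last bullet in miniature, a minimal sketch: Python's "call by object sharing" means rebinding a parameter never escapes the function, while mutating a shared object always does, and the coder never chooses between pointer and copy.

```python
# Sketch: Python hides the by-value/by-reference choice from the coder.
def rebind(x):
    x = [99]          # rebinds the local name only

def mutate(x):
    x.append(99)      # mutates the object the caller also holds

xs = [1, 2, 3]
rebind(xs)
print(xs)             # [1, 2, 3]      - caller unaffected
mutate(xs)
print(xs)             # [1, 2, 3, 99]  - caller sees the change
```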

@ccc
