Give me back my hat!

Are we agreed Python is shit?

Turns out that's what they teach in school nowadays

Crazy colons all over the place

Indentation is meaningful.  Wtf. We're not writing Fortran 66.

These two are enough to drive you crazy

The editor it comes with is retarded, and positively hostile

Etc.
Permalink MobyDobie 
November 10th, 2017 7:45pm
The editor it comes with?
Permalink Almost Anonymous 
November 10th, 2017 8:03pm
IDLE is the IDE.

There are nice Visual Studio extensions.
Permalink Legion 
November 10th, 2017 8:34pm
Python has its place.

The biggest problem isn't the indentation. Indentation is meh after you get used to it.

The real problem is the breaking language changes after version 2.7.

I refuse to update to 3.0 or whatever they have out now.
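
For anyone who hasn't been bitten, this is the flavor of it (shown as Python 3; what 2.x did is in the comments):

    print("hello")   # in 2.x, print was a statement: print "hello"
    print(3 / 2)     # 1.5 in 3.x; in 2.x this gave 1 (floor division)
    print(3 // 2)    # 1 - the explicit floor-division operator, same in both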
Permalink Legion 
November 10th, 2017 8:39pm
We moved to 3.x. Everything worked.
Syntax including indentation is fine.
Language is fine.
It’s not fast and you don’t have types.
Many libraries around.
Quick & easy for shortish programs. Say < 10K lines.
Permalink Q 
November 10th, 2017 10:14pm
I've been using emacs for Python.

They really did mess up the Python 2/3 thing.

I don't see it as substantially better than PHP.  The runtime performance is a little better.  For pandas/numpy, it gives nice wrappers to C libraries.
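
The wrapper point in miniature: the work below happens in one compiled C pass over the array, not in the interpreter loop.

    import numpy as np

    a = np.arange(1_000_000, dtype=np.float64)
    b = np.sqrt(a) * 2.0   # vectorized: one C-level pass over the array
    print(b[:5])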
Permalink FSK 
November 10th, 2017 11:54pm
It's incredibly fast, Python.  Don't let anyone tell you otherwise.
Permalink Tristan 
November 11th, 2017 1:32am
I really like Python. It has a variety of uses, it runs cross-platform, and if you use IronPython it has .NET interop, which is really useful if you're in a .NET shop but don't want to use PowerShell. It is great for utility scripts.
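
A minimal sketch of that interop (this runs under IronPython's ipy.exe, not regular CPython):

    import clr
    clr.AddReference("System")
    from System import DateTime, Environment

    print(Environment.MachineName)      # .NET property read from Python
    print(DateTime.Now.ToString("u"))   # .NET method call on a .NET struct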
Permalink Ruseman 
November 11th, 2017 5:58am
The big issue, as far as I'm aware, is the GIL, which makes it hard to implement concurrency.
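
Quickest way to see it: CPU-bound threads buy you nothing, because only one thread executes Python bytecode at a time. A minimal demo:

    import time
    from threading import Thread

    def burn():
        n = 0
        for _ in range(10_000_000):
            n += 1

    start = time.time()
    threads = [Thread(target=burn) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # takes roughly as long as calling burn() four times in a row
    print("4 threads: %.2fs" % (time.time() - start))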
Permalink NPR 
November 11th, 2017 6:26am
https://wiki.python.org/moin/GlobalInterpreterLock
Permalink NPR 
November 11th, 2017 6:35am
IronPython has no GIL and thus can effectively exploit multiple cores.
Permalink NPR 
November 11th, 2017 6:36am
Most people don't write multithreaded Python. If you want parallelism, message passing between multiple processes is common; see the multiprocessing library, for example.
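
A minimal sketch (square() is just a stand-in for real work); each worker is a separate process with its own interpreter and its own GIL:

    from multiprocessing import Pool

    def square(n):
        return n * n

    if __name__ == "__main__":   # needed on platforms that spawn workers
        with Pool(processes=4) as pool:
            results = pool.map(square, range(10))
        print(results)           # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

The trade-off is that arguments and results get pickled between processes, so it pays off for chunky CPU-bound work, not tiny tasks.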
Permalink Q 
November 11th, 2017 7:01am
>It's incredibly fast, Python.

Some of the libraries are; Python itself, not so much.

I agree that indentation is a non-issue, as is the IDE it comes with. I never even bothered trying it; I just went straight to PyCharm.
Permalink libtard_uk 
November 11th, 2017 8:07am
The only people who complain about meaningful indentation are people who are new to it and used to languages that don't do it. It's like granddad being used to his mangy old chair.

The GIL and the slowness are only problems if you are writing high performance low level code.

Python wasn't meant for low level code any more than Rust was meant for writing web apps and Excel macros. Python was meant to interface with and orchestrate high performance low level code.
Permalink . 
November 11th, 2017 11:07am
+1 .
Permalink .. 
November 11th, 2017 11:29am
I'm a Python fan, mostly. I'll just repeat what I posted the other month: Python is the least insane of the scripting-type languages I've used, though that's not to say it has no issues. The virtualenv stuff sucks, and the 2.x vs 3.x split is an ongoing (if less so over time) problem.

(Mostly, the indentation is a non-issue. I use Emacs and it does pretty much the right thing. Every now and again I do find myself having to do more manual fixup than I'd like, however, because indentation levels are explicitly opened but not explicitly closed.)
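
A made-up example of the sort of thing that bites:

    def total(items):
        result = 0
        for x in items:
            result += x
        return result   # indent this line one more level and it returns
                        # inside the loop after the first item; nothing
                        # "closes" the block, so no mismatch warns you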

But, compared to Perl, it isn't fundamentally crazy. Compared to Ruby, its users aren't mainly clueless fruitloops. Compared to Javascript, it's got at least a bit of taste, and, being a lot less popular, attracts fewer morons.

Overall I'd say I've generally been a happy Python user, but as with any dynamically-typed language it doesn't scale super-well and once my programs get past about 1,000 lines I do start to get the urge to redo them in C++ or (when appropriate) C#.
Permalink brone 
November 11th, 2017 12:17pm
Also good compared to Javascript: the standard library is good-sized, so you don't end up with 50 dependencies for just a little one-off script. For proper externally-facing production-quality stuff you'd probably have to look further afield in some cases - I doubt you'd want to run a web site off the standard library's HTTP server, for example - but for the stuff I've used it for this has been pretty rare.

Which is good, because the way Python does this stuff isn't half as neat as npm. (Which is at least partly why Javascript programs have so many damn dependencies: the Javascript standard library sucks, but it's easy to create and use self-contained packaged libraries.)
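
To make the standard-library point concrete: a throwaway static file server is a few lines with zero dependencies (fine locally; not, as I said, for production traffic):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # serves the current directory at http://localhost:8000/
    server = HTTPServer(("localhost", 8000), SimpleHTTPRequestHandler)
    server.serve_forever()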
Permalink brone 
November 11th, 2017 12:21pm
++ what brone said.

Not so good for programming in the large.  Not fast (if you think it is, I can only imagine you have no perspective on how fast modern hardware really is).

Personally I'd prefer indentation *and* syntactic block boundaries that must correspond, with a standard auto-formatter.  Redundancy is safety; after 30 years I've just seen too many 'hidden in plain sight' bugs in this area.

Python seems mostly very rational and a good choice for teaching IMO.  I learnt to program on an old, typeless and very friendly language called Data/BASIC (Pick BASIC); with hindsight, that was a hugely better place to start than Java or C (or, God forbid, C++).
Permalink Trog 
November 11th, 2017 1:14pm
Surprised to hear you are getting stopped out at 1,000 lines. That's what, two weeks or a month of work?

Some folks who like types feel that way and move to Go or Julia. Unfortunately the library ecosystem isn't quite there, especially for scientific computing.
Permalink Q 
November 11th, 2017 1:37pm
What about Swift?
Permalink Yoda 
November 11th, 2017 2:11pm
what about it?

you can program your iphone?

i don't hear much about it
Permalink Q 
November 11th, 2017 3:06pm
Who cares? You can make good money with it.
Permalink Shylock 
November 11th, 2017 6:03pm
With Python? I wouldn't imagine the rate is anything extraordinary.
Permalink Q 
November 12th, 2017 10:09am
All the data science jobs ask for it. Some of them (not all by any means) are quite lucrative.
Permalink Shylock 
November 12th, 2017 10:23am
Ah yes. Python + data science seems quite hot. USD 250K?
Permalink Q 
November 12th, 2017 10:49am
Cut the zero and you're at it.
Permalink Io 
November 12th, 2017 11:07am
Maybe in Eastern Europe
Permalink @oddball 
November 12th, 2017 11:27am
>USD 250K?

Some. However, the Indians have put the keywords on their resumes and now I'm seeing data science jobs for 50K. But there are still some places that'll pay that kind of decent rate.
Permalink Shylock 
November 12th, 2017 12:55pm
Python alone is not worth $250k. Python + TensorFlow + domain knowledge - now we're talking.
Permalink Yoda 
November 12th, 2017 1:03pm
At my last job, I had an ex-coworker who was completely useless, no clue at all.  He now works as a "Data Scientist".
Permalink FSK 
November 12th, 2017 1:27pm
Whether something is worth anything or not is of no import. All that matters is what the market will bear.

I'm wondering how much longer this data science fad will last. Where I am now, they created a data lake. That turned out to be PFU.
Permalink Shylock 
November 12th, 2017 1:45pm
PFU?
Permalink Q 
November 12th, 2017 3:38pm
Pretty Fucked Up?
Permalink Yoda 
November 12th, 2017 4:28pm
Pretty Fucking Useless.
Permalink Shylock 
November 12th, 2017 4:37pm
I suspect that's what the NSA have created.
Permalink Zed 
November 12th, 2017 5:14pm
Why was it useless, Shylock?
Permalink @oddball 
November 12th, 2017 6:14pm
The data lake? Because all a data lake is, is putting the source system data in one place and thinking you've accomplished something. The hard part is figuring out how to ask the data questions and get useful answers from it. That's slow and hard and detailed and you have to interact with the business people a lot.

You build a data lake, you've accomplished jack shit but you hit a milestone that the people with the money will think is cool.
Permalink Shylock 
November 12th, 2017 6:17pm
Having the data in one place seems useful
Now I can join what used to be in multiple databases, files, etc.

You seem attached to believing your part of the process is the main thing.
Permalink @oddball 
November 12th, 2017 7:45pm
It is only one small step in a long process. It's basically just adding another step to the staging process. Now, if there's some good reason to just dump your source data in one place (maybe something to do with scheduling?), then OK, but otherwise it adds no value to the process.

You still have to clean the data, then transform it, then create report datasets, then present those datasets. It's much ado about very little.
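
In miniature, with pandas and made-up columns, that pipeline looks like this; note that none of it comes for free just because the source data sits in one place:

    import pandas as pd

    # stand-in for a dump pulled out of the lake (columns invented)
    raw = pd.DataFrame({
        "customer_id": [1, 2, None, 4],
        "region": ["east", "west", "east", "west"],
        "amount": ["10.0", "7.5", "3.0", None],
    })
    clean = raw.dropna(subset=["customer_id", "amount"]).copy()  # clean
    clean["amount"] = clean["amount"].astype(float)              # transform
    report = clean.groupby("region")["amount"].sum()             # report dataset
    print(report)                                                # present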
Permalink Shylock 
November 12th, 2017 8:04pm
Okay, you were there, you know how much ado it was.
Permalink X 
November 12th, 2017 8:29pm
I've been doing this since the mid-1990s.
Permalink Shylock 
November 12th, 2017 8:39pm
Data Lake = Gowanus Canal?
Permalink FSK 
November 12th, 2017 9:08pm
Heh.
Permalink Shylock 
November 13th, 2017 4:35am
Hm, thanks for that note about the GIL. I didn't know about that.

I've been trying to get a real time low latency application working in Python. I've done a lot of work carefully managing locks to be in and out of critical sections quickly, but I still don't get the performance I need.

Now I'll be looking into whether this GIL is the problem and Python is simply not suitable for some categories of applications.
Permalink Reality Check 
November 13th, 2017 6:18am
"Real-time low-latency", in Python?

Yeah, I don't think so.  Unless your "real-time" is within about a second (not a milli-second, not a nano-second, but a full second).

And your "low-latency" is also within a second.

With those constraints, Python might work fine.
Permalink SaveTheHubble 
November 13th, 2017 8:27am
I need sub-millisecond latency and jitter in response to real time data coming in on interrupts. So I have the interrupt thread, a timer thread, a scheduling thread, and the interface/interaction thread on top of it all. The interface thread timing is not as important.

Problem right now is the jitter, around 20ms. Not good enough. Probably these locks.
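
One way to separate OS jitter from my locks is a bare-sleep probe, no threads or locks at all (rough sketch):

    import time

    worst = 0.0
    for _ in range(1000):
        start = time.perf_counter()
        time.sleep(0.001)                    # request a 1 ms sleep
        elapsed = time.perf_counter() - start
        worst = max(worst, elapsed - 0.001)  # overshoot vs. the request
    print("worst overshoot: %.2f ms" % (worst * 1000.0))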

Your saying you can't get below 1 second of accuracy is completely wrong, so I don't find your comment useful.
Permalink Reality Check 
November 13th, 2017 10:20am
I would add Perl to that unholy list as well.

Both languages are a pain in the ass to work with.

For my money R is the bomb.com. There are so many reasons to learn it that they are too numerous to list.

Bet on it.....and take it to the bank.
Permalink Brice Richard 
November 13th, 2017 12:52pm
Why do you prefer R to Python?
Permalink Wonder Woman 
November 13th, 2017 2:25pm
20 mS is the typical task-switch minimum 'tick' of Linux and perhaps Windows (I haven't measured Win-10).  So it would be difficult to get better than that, without having a real-time based OS as well.  Like VxWorks perhaps.  Or maybe OS-9.

I don't know what the actual latency is of Python -- that's like asking if a VW-Beetle can out-run a Lamborghini.

Sure, it could, if the Lambo was broken, or being driven by an idiot.  But most people wouldn't try it.
Permalink SaveTheHubble 
November 13th, 2017 3:56pm
For time-critical real-time applications, isn't the issue usually that you never really know when most OSs are going to suddenly decide to go off and give priority to another process?
Permalink Z 
November 13th, 2017 4:02pm
Correct.  Real real-time OS's have predictable (or even guaranteed) interrupt latency bounds.

Using Linux as a pseudo-real-time OS can be done, if you can accept wider bounds.  Since Linux is free, and real real-time OS's tend NOT to be free, that's a nice situation to be in.

If you can't accept the wider interrupt latency bounds, perhaps a nice satellite processor external to your main workstation can handle the real-time aspects, while the wider bounds of the Linux workstation can handle the user interface aspects -- which tend to be more forgiving.
Permalink SaveTheHubble 
November 13th, 2017 4:05pm
Hubble, do you mean mS = microseconds or milliseconds?
Permalink Yoda 
November 14th, 2017 3:15am
Hubble is just talking shit

Read instead

https://stackoverflow.com/questions/16401294/how-to-know-linux-scheduler-time-slice
Permalink Hi there 
November 14th, 2017 4:14am
Maybe the answer is to look at what is done where it REALLY matters?

HFT
Missile guidance systems
Braking systems in cars
Permalink Assad's brighter brother 
November 14th, 2017 4:27am
mS is milli-seconds.

uS is micro-seconds.  The letter 'u' stands in for the Greek letter mu.

pS is pico-seconds.
Permalink SaveTheHubble 
November 14th, 2017 3:23pm
