This is almost certainly LLM generated.
Six flowery “from-to”s in one article:
>from beginners to seasoned professionals
>from seasoned backend engineers to first-time data analysts
>from GPU acceleration and distributed training to model export
>from data preprocessing with pandas and NumPy to model serving via FastAPI
>from individual learners to enterprise teams
>from frontend interfaces to backend logic
And more annoyingly, at least four “not just X, but Y”.
>it doesn’t just serve as a first step; it continues adding value
>that clarity isn’t just beginner-friendly; it also lowers maintenance costs
>the community isn’t just helpful, it’s fast-moving and inclusive
>this network doesn’t just solve problems; it also shapes the language’s evolution
And I won’t mention the em-dashes out of respect to the human em-dash-users…
This stuff is so tiring to read.
I never really took Python seriously, but lately I've been programming almost all of my personal projects in Python. The way I see it: For any kind of project I might take on, Python is never the best language to do it in, but is almost always the second-best language to do it in. This makes it a great default, especially if it's just a little one-off tool or experiment.
> Python is never the best language to do it in, but is almost always the second-best language to do it in.
I've been writing Python since the last century, and this year is the first time I'm writing production-quality Python code; everything up to this point has been first-cut prototypes or utility scripts.
The real reason it has stuck with me while others came and went is its REPL-first attitude.
A question like
>>> 0.2 + 0.1 > 0.3
True
is much harder to demonstrate in other languages.
The REPL isn't just for the code you typed out; it also lets you import and run your library functions locally to verify a question you have.
It is not without its craziness with decorators, fancy inheritance[1] or operator precedence[2], but you don't have to use it if you don't want to.
[1] - __subclasshook__ is crazy, right?
[2] - you can abuse __ror__ like this https://notmysock.org/blog/hacks/pypes
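For the curious, here is a minimal sketch of the kind of __ror__ pipe trick that post describes; the Pipe class below is an invented illustration, not the blog's actual code.

class Pipe:
    """Wrap a callable so that `value | Pipe(fn)` evaluates fn(value)."""
    def __init__(self, fn):
        self.fn = fn
    def __ror__(self, left):
        # Called when the left operand's own __or__ doesn't handle us.
        return self.fn(left)

print([1, 2, 3] | Pipe(sum))       # 6
print("abc" | Pipe(str.upper))     # ABC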
lol, here it is in the https://raku.org repl…
Welcome to Rakudo™ v2025.06.1.
Implementing the Raku® Programming Language v6.d.
Built on MoarVM version 2025.06.
To exit type 'exit' or '^D'
[0] > 0.1 + 0.2 > 0.3
False
Raku uses a rational type by default for those which will give an exact value. If you use Python's Fraction type it would be equivalent to your Raku. The equivalent in Raku to the Python above would be:
Which will evaluate to True.
Well, yes - but the funny thing is that the example chosen to illustrate the convenience of a REPL is - errr - factually wrong:
a scientist knows that 0.1 + 0.2 is not greater than 0.3; only a computer geek would think that this is OK.
This is the third Raku reference I've come across since this morning.
Happy to see Raku getting some press.
I might be wrong, but I was under the impression that the result is platform-dependent?
https://docs.python.org/3/tutorial/floatingpoint.html#floati...
Raku represents 0.1, 0.2, and 0.3 internally as rational numbers. https://docs.raku.org/type/Rat. 1/10 + 1/5 == 3/10
Note that "On overflow of the denominator during an arithmetic operation a Num (floating-point number) is returned instead." A Num is an IEEE 754 float64 ("On most platforms" says https://docs.raku.org/type/Num)
Python always uses IEEE 754 float64, also on most platforms. (I don't know of any Python implementation that does otherwise.) If you want rationals you need the fractions module.
>>> from fractions import Fraction as F
>>> F("0.1") + F("0.2") == F("0.3")
True
>>> 0.1 + 0.2 == 0.3
False
This corresponds to Raku's FatRat, https://docs.raku.org/type/FatRat. ("unlike Rat, FatRat arithmetics do not fall back Num at some point, there is a risk that repeated arithmetic operations generate pathologically large numerators and denominators")
Good explanation.
That said, decimals (e.g. 0.1) are in fact fractions, and the subtlety that decimal 0.1 cannot be precisely represented by a binary floating-point number in the FPU is ignored by most languages, where the core math is either integer or IEEE 754.
Bringing rational numbers in as a first-class citizen is a nice touch for mathematicians, scientists, and so on.
Another way to look at it for Raku is that:
Int → integers (ℤ)
Rat → rationals (ℚ)
Num → reals (ℝ)
Just about anything that has a REPL has a better REPL than Python.
Given how slow Python is, isn't it embarrassing that 0.2 + 0.1 > 0.3 ?
I have some test Rust code where I add up about a hundred million 32-bit floating-point numbers in the naive way, and it takes maybe a hundred milliseconds. Then I do the same but accumulating in a realistic::Real (because hey, how much slower is this type than a floating-point number?), and that's closer to twenty seconds.
But if I ask Python to do this, Python takes about twenty seconds anyway, and yet it's using floating-point arithmetic, so it gets the sum wrong, whereas realistic::Real doesn't, because it's storing the exact values.
If you have a hundred million integers to add, please, by all means, use Rust, or C, and intrinsics for the features not yet in your favorite math libraries. You can call Rust from Python, for instance. Polars is excellent, BTW. This seamless integration between a nice language amenable to interactive experimentation and highly optimized code produced in many other languages I don't want to write code in is what makes Python an excellent choice in many business cases.
At least with BigInts, the CPython implementation is quite fast. See https://www.wilfred.me.uk/blog/2014/10/20/the-fastest-bigint...
https://0.30000000000000004.com/#rust
Er, yes, I'm aware why this happened. My point is that this happens in the hardware floating point, yet Python is as slow as the non-accelerated big rationals in my realistic::Real (it's actually markedly slower than the more appropriate realistic::Rational, but that's not what my existing benchmarks cared about).
> this happens in the hardware floating point
Not really. It's a limitation of the IEEE floating point format used in most programming languages. Some numbers that look nice in base 10 don't have an exact representation in base 2.
Rational numbers whose denominator has no prime factors other than those of the base have an exact (terminating) fractional representation in that base.
1/3 doesn't have an exact representation in base 10 or base 2. 1/5 does have an exact representation in base 10 (0.2), but doesn't in base 2. 1/4 has an exact representation in base 10 (0.25) and in base 2 (0.01).
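You can see this directly at a Python prompt by asking for the exact value a float literal actually stores; Fraction is in the standard library, and the exact digits below assume the usual IEEE 754 float64 doubles (as in CPython).
>>> from fractions import Fraction
>>> Fraction(0.1)   # the binary float nearest to 1/10, not 1/10 itself
Fraction(3602879701896397, 36028797018963968)
>>> Fraction(0.25) == Fraction(1, 4)   # 1/4 terminates in base 2, so it is exact
True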
That said, changing how you think about programming... even with jshell I still think Java in classes and methods (and trying to pull in larger frameworks is not as trivial as java.lang packages). However, I think Groovy (and a good bit of Scala) in a script writing style.
jshell itself is likely more useful for teaching than for development - especially once you've got a sufficiently complex project and the IDE integration becomes more valuable than the immediate feedback.
Still, something to play with and one of the lesser known features of Java.
>>> 0.2 + 0.1 > 0.3
That is false. What is it that "...is much harder to demonstrate in other languages"? I am missing something.
lol, here it is in the https://raku.org repl…
Welcome to Rakudo™ v2025.06.1.
Implementing the Raku® Programming Language v6.d.
Built on MoarVM version 2025.06.
To exit type 'exit' or '^D'
[0] > 0.1 + 0.2 > 0.3
False
> That is false.
What's false about it? That is the result if you're using IEEE floating point arithmetic.
It's the result if your programming language thinks 0.2 + 0.1 means you want specifically the 64-bit IEEE floating point binary arithmetic.
But where did we say that's what we want? As we've seen, it's not the default in many languages, and it isn't mandatory in Python; it's a choice. The usual argument for that choice would be "it's fast", except Python is slow, so what gives?
You aren't answering my question. I asked worik why they're claiming that something that happens is false. It's insane to claim that reality isn't reality. Run that in a Python REPL and that is the result you get.
Interesting observation about Python providing the worst of all possible worlds: unintuitive arithmetic without any of its speed advantages.
But in answer to “where did we say that's what we want?” I would say, as soon as we wrote the expression, because we read a book about how the language works before we tried to use it. After reading, for example, a book¹ about Julia, we know that 0.1 + 0.2 will give us something slightly larger than 0.3, and we also know that we can type 1//10 + 2//10 to get 3//10.
[1] https://lee-phillips.org/amazonJuliaBookRanks/
> I would say, as soon as we wrote the expression, because we read a book about how the language works
I'm comfortable with that rationale in proportion to how much I believe the programmer read such a book.
I haven't taken the course we teach to, say, chemists (I should maybe go audit that), but I would not be surprised if either it never explains this, or the explanation is very hand-wavy: something about it not being exact, maybe invoking old-fashioned digital calculators.
The amusing thing is when you try to explain this sort of thing with a calculator: a modern calculator is much more effective at this than you expected, or than you remember from a 1980s Casio. The calculator in your modern Android phone, say, knows all the stuff you were shown in school. It isn't doing IEEE floating-point arithmetic, because that's only faster, and you're a human using a calculator, so "faster" in computer terms isn't important; it has prioritized being correct instead, so that pedants stop filing bugs.
It's a beautiful and expressive language, and I use it a lot. My only gripe is how prone it is to bit rot, too sensitive to version changes in libraries or runtimes. It needs frequent babysitting if you want to keep everything updated.
Even if it's less secure, the beauty of Go (a single static binary, impervious to version changes) is very appealing for small projects you don't want to keep returning to for housekeeping.
I have written a not inconsiderable amount of Python. But the problem I found was that I spent too much time debugging runtime issues, and I do not enjoy that. I found that while it took longer to write correct Rust to solve the problems I wanted solved, I enjoyed it, so it was a better deal.
That's a pretty good way to frame it. The speed of development, to me, justifies the "second best" aspect.
Because:
1: Simple is better than complex.
2: Beautiful is better than ugly.
3: Readability counts.
Winners across many markets seem to get the importance of simplicity. Compare the Google homepage to the Bing homepage. The dashboard of a Tesla to that of a Volkswagen. Remember how hardcore lean Facebook's UI was compared to MySpace?
Tesla's dashboard is not simple. Having a touchscreen for everything instead of physical buttons is a travesty.
>The dashboard of a Tesla to that of a Volkswagen
Dude!
You think a touch screen tablet replacing all the knobs and tactile buttons is actually a step forward?
The main reason I don’t like Python very much (besides performance) is that it makes what should be simple tasks complex due to the expression problem, arising from its use of class-based object orientation. You can avoid some of the issues in your own programs (but can’t escape insanity such as ",".join(["a", "b"])), but as soon as you delve into a library of any complexity to make alterations, you’re mired in a morass of methods encased in classes inheriting from other classes, and nothing is straightforward. Discovering Julia, which solves the expression problem¹, was enlightening. Even if it were as slow as Python, I would still prefer it: it is simpler, more beautiful, and more readable.
https://arstechnica.com/science/2020/10/the-unreasonable-eff...
My use case is scientific computing and for that Python is excellent thanks to Numpy, IPython and Numba. It's often possible to write code that is nearly as fast as C, but it's far easier to write and debug, since you can just inspect and run code snippets in the shell. In that regard, it's similar to MATLAB, but it's FOSS and faster with the right libraries.
My background is in philosophy and formal logic. The idea that python isn’t the #1 language used is insane to me.
Yeah, I'm sure there are a lot of technical reasons to use other languages, but with Python, you can just read it. I remember buying "Learn Python the Hard Way" about 15 years ago, and just looking through the book thinking… wait, I can already read this.
Natural language parallels are huge.
I'm surprised at the negative comments about Python here. Python has been my favorite language since I learned it, and nothing else has come close to it.
I'm currently on a pure JS project (Node and Vue) and I'm constantly fighting with things that would be trivial in Python or Django. Also getting into the .NET world and not impressed with that at all.
I know folks like Go, but in decades of being a software consultant I've come across zero companies using it or hiring for it.
Care to give some examples of those trivial things you’re fighting?
Regardless of the reasons why, the fact is python works well enough.
There are many things I wish python, or a language like python, could improve on. Yet despite all my wishes, and choosing Rust/Go more often recently, Python still works.
I’m in a love/hate relationship with python. I think it’s becoming more of acceptance now haha.
For me, a python user since the late '90s, the answer has always been simple:
Guido has taste.
He stepped back and now we have the walrus operator.
At least we don't have to use it.
Guido approved the walrus. It was the negative response which he said led to him quitting.
Casual python user here. I wasn't aware of this controversy.
Why was there a backlash for this operator? (looks kinda neat). Was it breaking things?
I am not a keyboard warrior who got caught up in the nonsense, but I think some people were simply annoyed at adding syntactic sugar for very marginal benefit. “There should be one way to do things” mantra.
I have a long list of grievances with Python, but the walrus situation would never crack my top ten. Put effort into removing cruft from the standard library, make typing better, have the PSF take a stance on packaging. Anything else feels like a better use of time.
Whatever, it won. I will never use it, but when I see it I will have to scratch my head and look up the syntax rules.
I think Python's centrality is a consequence of its original purpose as a language intended for instruction.
Yeah, some of its design decisions required immense cost and time to overcome to make for viable production solutions. However, as it turns out, however suboptimal it is as a language, this is more than made up for by the presence of a huge workforce that's decently qualified to wield it.
Python's original purpose was as a scripting language for Amoeba. Yes, it was strongly influenced by ABC, an introductory programming language which van Rossum helped implement, but that was a different language.
https://docs.python.org/3/faq/general.html#why-was-python-cr...
""I was working in the Amoeba distributed operating system group at CWI. We needed a better way to do system administration than by writing either C programs or Bourne shell scripts, since Amoeba had its own system call interface which wasn't easily accessible from the Bourne shell. My experience with error handling in Amoeba made me acutely aware of the importance of exceptions as a programming language feature.
It occurred to me that a scripting language with a syntax like ABC but with access to the Amoeba system calls would fill the need."""
Python is the new Pascal.
It is probably the first language for 99% of the computer science students who didn't know any programming before college. And even for those who knew programming, chances are that a lot of them have at least dabbled a little with it.
Perl 6
It's easy to forget how big Perl was before the 10 Year Stall happened. (I.e. when development on Perl 5 was stalled in wait for Perl 6 which never happened. By the time Perl 6 was renamed to Raku, it was too late for Perl 5. It now lives on, but it lost a lot of momentum.)
Maybe one factor is its versatility in leetcode, which to a first approximation every SWE has to do.
Declare sets, lists, and maps in one line. Don't need to worry about overflow or underflow. Slicing is extremely convenient and expressive. Types aren't necessary, but that's rarely confusing in short code blocks.
Compare that to JS, where you're immediately dealing with var/let/const decisions that use up unnecessary mental energy.
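For illustration, the kind of one-liners being described (the values here are made up):
>>> vals = [3, 1, 4, 1, 5]; seen = set(vals); ranks = {"a": 1, "b": 2}
>>> vals[1:4]            # slice
[1, 4, 1]
>>> vals[::-1]           # reverse via a slice
[5, 1, 4, 1, 3]
>>> sorted(seen)         # dedupe and sort in one line
[1, 3, 4, 5]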
I've been programming only in Python for a while now. I got a certificate that required digging into the language enough to do the code in your head. I code projects in VS Code. I've enjoyed it, esp library availability.
I do warn people that it's not as easy or intuitive as advertised. I've often been bitten by unexpected errors. I think a survey of these might be worthwhile.
One was typing or conversion errors. Some conversions, like int-to-str in a string concatenation, seem pretty obvious. That isnumeric() didn't consider negative numbers as numbers was a surprise.
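For example, at the prompt (these are the standard str methods; the values are just illustrative):
>>> "12".isnumeric()
True
>>> "-12".isnumeric()   # the minus sign is not a numeric character
False
>>> int("-12")          # int() parses the sign just fine
-12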
Sometimes it's consistency. I've often alternated between lists and sets in applications. I prefer to keep most data as a list but use sets for uniqueness checks or redundancy filtering. Despite both being collections, one uses .append() and one uses .add(). Little differences not only add confusion: I have to modify my codebase when mixing or switching them, which can become a type error later in another spot.
Another was that common operations usually found in one place were split across two. That happened with time vs. datetime, and with filesystem operations, which might take two modules. I've considered making a wrapper that turns all those into one thing with internal redundancy removed. There might be a reason others haven't done that, though.
Another issue was distribution. I can do straightforward building of console apps for two platforms. That's reliable. If worried about reliability, Python apps seemed easier to deliver as a Flask site than to distribute my utilities as standalone binaries. Nuitka was really impressive, though, in terms of the work that must have gone into it.
In my case, I also missed the ability to easily make linked lists as in C to build trees. I wanted to build a C-like tree in Python but it couldn't do self-referential structures IIRC. Since that app's requirements were C-like, and performance was no issue, I actually simulated a C-like heap in Python, ported a C tree to it, and built the tool on that. I also got a use-after-free in Python of all things, lol. Anyway, I was surprised there was a data structure C could do but a high-level, GC'd, reflective language couldn't. There might be a trick for this others know about, but I figure they just wrap C code for those things.
On the positive side, the debugging experience with Python was so much better than some beginner languages. I'm grateful for the work people put into that. I also love that there are easy-to-use accelerators, from JITs to the C++ generator.
I was wanting an acceleratable subset with metaprogramming when Mojo appeared. I might try it for toy problems. I gotta try to stay in idiomatic-ish Python for now, though, since it's for career use.
I continue to believe that python is only still popular for the ecosystem effect. Students are taught it, a bunch of libraries were written for it, now everyone keeps using it.
But it's syntactically weak? Python itself is slow? pip is awful (although uv is pretty good). Sometimes I am forced to write Python because the packages I need are written for it, but I always hate it.
> Students are taught it, a bunch of libraries were written for it, now everyone keeps using it.
This is revisionism.
When I was in school, that's what people would say about C/C++/Java.
People like me switched to Python well before it became common to teach it at school. Lots of great libraries were written in it before then as well. I mean, it's far easier to write a library in it than it was in most other languages.
It was picked for teaching in schools because it was a decent language, and was already widespread. It's much more useful to teach Python to an average engineering student than C/C++/Java.
It became popular because it was easy to learn, and a lot nicer to use than the other big scripting language: Perl. When I was in grad school, I would routinely find non-engineers and non-programmers using it for their work, as well as writing libraries for their peers. There is no way they would have done that in any of the prevailing languages of the time - many of them learned to program while in grad school. It became quite popular amongst linguists. Then NumPy/SciPy became a popular alternative to MATLAB, and once again my peers, who would never write C++ to do their work, were suddenly getting matching speed by using Python. That's how it ended up being the language for ML.
So no - the fact that it's taught in schools is not the reason it's used today.
> It became popular because it was easy to learn, and a lot nicer to use than the other big scripting language ... I would routinely find non-engineers and non-programmers using it for their work, as well as writing libraries for their peers. ... many of them learned to program while in grad school.
Sure, and this is my argument. It is easy to start out with, which makes it appealing to academics without a CS background. But is this a good thing? Because then these academics without a CS background write monstrous, poorly optimized scripts compounded by a slow language, then use drastically more compute to run their science, and then at the end publish their paper and the scripts they used are very hard to adapt to further research because very rarely are they structured as modules.
Just because it is easy to write doesn't mean it is the most appropriate for science. It is not commonly acceptable to publish papers without a standardized format and other conventions. Students put in work to learn how to do it properly. The same should be true for code.
> But is this a good thing? Because then these academics without a CS background write monstrous, poorly optimized scripts compounded by a slow language, then use drastically more compute to run their science, and then at the end publish their paper and the scripts they used are very hard to adapt to further research because very rarely are they structured as modules.
For many of them, the alternative is that they simply wouldn't have done the science, and would have focused on research that didn't need computation. Or they'd use some expensive, proprietary tool.
Prior to Python becoming popular among them (circa 2005-2007), plenty of good, fast, whatever-metric-you-want languages existed. These academics were not using them.
Python is really what enabled the work to be done.
I treat it pretty much like bash. It's good for scripts, and it's great for serverless tasks like running as an AWS Lambda. If you want to run some simple script that queries a DB and/or hits an API on a schedule or triggered by an event, Python is arguably the best way to do that because the handler and interpreter work so well together. Even still, you'd get better performance with Node.
I don't like Python for "applications" as much. I was at a place almost 10 years ago now that had Python "microservices" running in Kubernetes. Managing performance and dependencies for Python in production is just unnecessarily futile compared to something like Go, which is also very accessible syntactically.
> Python itself is slow
Slow is relative. You need to account for the time to write as well, and amortize over the number of times the code is run. For code that is run once, writing time dominates. Writing a Java equivalent of half of the things you can do in Python would be a nightmare.
> For code that is run once, writing time dominates.
where "run once" in the sense you describe is really the case has been rare for me. Often these one off scripts need to process data, which involves a loop, and all of the sudden, even if the script is only executed once, it must loop a million times and all of the sudden it is actually really slow and then I must go back and either beat the program into shape (time consuming) or let it execute over days. A third option is rewriting in a different language, and when I choose to do a 1:1 rewrite the time is often comparable to optimizing python, and the code runs faster than even the optimized python would've. Plus, these "one off" scripts often do get rerun, e.g. if more data is acquired.
Java is a sort of selective example. I find JavaScript similarly fast to write and it runs much faster.
It's syntactically strong.
AFAIK, it's the preferred language for lots of leetcode, e.g. Advent of Code. The practical expressivity and speed of dev is strong.
Just for example, the lack of explicit variable defs. I create a variable and update it later. I rename the variable but forget to change it where it is updated. No errors produced in the IDE because the place where it is updated is now just creating the variable instead so all the references still work. `let` is a good keyword.
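A minimal sketch of that failure mode; the load() function and the names are invented for illustration:

def load():
    return ["a", "b", "c"]

# `result` was renamed to `items` everywhere except the line that updates it:
items = []
result = load()   # forgotten rename: silently binds a brand-new variable
print(items)      # [] -- stale value, and no error from the interpreter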
It drives me crazy how everything just... blocks. Parallelization is a disaster.
I won't comment much on indentation. All the string formatting syntaxes are crazy. Private names managed by `_`. The type system still mostly sucks. Etc., etc.
In my experience it is alright to write short scripts but complexity very quickly balloons and python projects are very often unwieldy.
I would write a script in python, but only if I was positive I would never need to run or edit it again. Which is very rarely the case.
> No errors produced in the IDE because the place where it is updated is now just creating the variable instead so all the references still work. `let` is a good keyword.
I too would prefer let. But the number of times what you mentioned has bitten me in almost 20 years of programming Python can be counted on in one hand. And these days, just use the "rename" feature in your IDE!
Well it’s as close to executable pseudocode as it gets, which I would say is both a positive and a negative.
JavaScript is like that. C++ was kinda like that. I think "Popularity among novices" is the only thing that determines whether a language will succeed or fail in the long term (say, 20 years)
> JavaScript is like that.
JavaScript has gotten drastically better, especially since ES6. The lack of venvs alone is such a breath of fresh air. The baseline typing sucks more than python, but typescript is so much better. It still suffers from legacy but the world has mostly moved on and I don't see beginners learning legacy stuff much anymore. Modern javascript is amazing. I do my work in Elysia, Kysely (holy crap the python database situation is awful) and everything just works and catches tons of errors.
I have written code in Pascal, C, C++, Java, TypeScript, PHP and Python in my life.
Of this entire pack, Python seems to have the widest ecosystem of libraries. I don't think I ever ran into a "have to reinvent the wheel" problem.
Need to run four test operations in parallel? asyncio.gather(). Need to run something every five minutes? schedule.every(). In most cases, it is a one-liner, or at most two-liner, no sophisticated setup necessary, and your actual business logic isn't diluted by tons of technical code.
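Roughly, the two one-liners in question look like this; it's only a sketch, check() and housekeeping() and the timings are made up, and `schedule` is the third-party schedule package:

import asyncio
import time

import schedule  # third-party: pip install schedule

async def check(name):
    await asyncio.sleep(0.1)   # stand-in for a real test operation
    return f"{name}: ok"

async def main():
    # Run four test operations in parallel and collect their results.
    return await asyncio.gather(check("a"), check("b"), check("c"), check("d"))

print(asyncio.run(main()))

def housekeeping():
    print("running housekeeping")

# Run something every five minutes; run_pending() must be polled in a loop,
# so this part blocks the script.
schedule.every(5).minutes.do(housekeeping)
while True:
    schedule.run_pending()
    time.sleep(1)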
Performance-critical parts can be always programmed in C and invoked from the Python code proper.
From time to time I peek over from VS Code to PyCharm, and recently I have been surprised how much it seems to fall behind. No (official) ruff integration; you have to use 'External Tools', which is not part of the backup & sync feature. Seriously?
Something like uv has been long, long overdue.
Python’s popularity seems to me driven by ML and data science.
Personally, I can't take seriously any language without a good type system, and no, optional type hints don't count. Such a type system should express nullability and collection parameterization (i.e. generics).
I simply won’t write a lot of code in any language where a typo or misspelling is a runtime error.
I've been working with Python professionally for just over a year now, and the main difference, having come from a .NET background, seems to be that tests are far easier to write and more prevalent in our codebase. I'm sure some of that is cultural, but "runtime" errors are quickly found by any reasonable test suite.
That said, it always saddens me that the ML family (as in OCaml and F#) doesn't get more love. They can hit all the same bases as Python (ease of readability, multi-paradigm, module-first programming) but just never got the same love.
I think you got it backwards: Python was used in ML and data science because it was already one of the preferred languages. A single line can do the work of whole functions in other languages, while remaining clear.
Also, the tooling is abundant and of great quality, which made it the logical choice.
Most people don't run any Python programs on their machines except for OS package managers. Google and others moved to Go, and the Python job market does not reflect these statistics at all.
Python is well marketed, with dissenting voices silenced, de-platformed and defamed with PSF involvement. That way many users think the Python ruling class are nice people. It is therefore popular among scientists (who buy expensive training courses) and students (who are force fed Python at university).
It has a good C-API, which is the main reason for its popularity in machine learning. Fortunately for Python, other languages do not take note and insist on FFIs etc.
EDIT: The downvotes are ironic given that Python needs to be marketed every three days here with a statistic to retain its popularity. If it is so popular, why the booster articles?
I work in a chemistry research field. Most people I know run Python programs for their research. No one I know uses Go. I only know a handful who use Java. Rust and Julia are oddities that appear occasionally.
Sure, we have very different experiences. But that also means that unless you can present strong survey data, it's hard to know if your "Most people" is limited to the people you associate with, or is something more broadly true.
The PSF overlap with my field is essentially zero. I mean, I was that overlap, but I stopped being involved with the PSF about 8 years ago when my youngest was born and I had no free time or money. In the meanwhile, meaningful PSF involvement became less of a hobbyist thing and more corporatized... and corporate-friendly.
> scientists (who buy expensive training courses)
ROFL!! What scientists are paying for expensive training courses?!
I tried selling training courses to computational chemists. It wasn't worth the time needed to develop and maintain the materials. The people who attended the courses liked them, but the general attitude is "I spent 5 years studying <OBSCURE TOPIC> for my PhD, I can figure out Python on my own."
> who are force fed Python at university
shrug I was force fed Pascal, and have no idea if Wirth was a nice person.
> main reason for its popularity in machine learning
I started using Python in the 1990s both because of the language and because of the ease of writing C extensions, including reference counting gc, since the C libraries I used had hidden data dependencies that simple FFI couldn't handle.
I still use the C API -- direct, through Cython, and through FFI -- and I don't do machine learning.
> If it is so popular, why the booster articles?
It's an article about a company which sells a Python IDE. They do it to boost their own product.
With so many people clearly using Python, why would they spend time boosting something else?
Yeah brother, down with Big Python!