MS-DOS 5.0 on my first PC was a bit short on features and I didn't have enough money for Windows 3.1... I heard that American students were using something called Unix and that there was something close available through mail-order CDs. Yggdrasil CDs were cheap too!
Given this trend, GPT 5 or 6 will be trained on a majority of content from its previous versions, modeling them instead of the expected full range of language. Researchers have already tested the outcome of a model-in-the-loop with pictures; it was not pretty.
You mean Microsoft will recoup the cost of unbundling by charging more per product than the previous bundle cost, given that they're now separate products?
'cause at work the powers that be have gone all-in on MS, and this decision won't change their "strategy" one bit.
Each and every line of code you write is a liability. Even more so when you wrote it for someone else. You must always be able to rebuild it from source, at least as long as your client expects the software to work. If you feel it's not worth it, you probably low-balled the contract. If you don't want to maintain code, have the client pay a yearly maintenance fee, hand the code and the responsibility to maintain it over to your client at the end of development, or add a time limit to its support.
There's no "maintenance mode" software: either it's in use and must be kept updated with regard to its execution environment, or it's not used anymore and can be erased and forgotten. Doing otherwise opens too many security issues, which none of us should find acceptable as a trade.
Forewarning: ops here. I'm one of the few the bosses come to when the "quick code" in production goes sideways and the associated service goes down.
soapbox mode on
Pardon my French, but that's a connerie: utter nonsense.
Poorly written code, however fast it was delivered, will ultimately translate into a range of problems going from customer dissatisfaction to complete service outage, a spectrum of issues far more damaging than a late arrival on the market. I'd add that "quick and dirty code" is never "quick and dirty code with relevant, automated test coverage", which increases the likelihood of the aforementioned failures, the breadth of their impact and the difficulty of fixing them.
Coincidentally, any news about yet another code-pissing LLM bothers me a tad, given that code-monkeys using such atrocities wouldn't know poorly written code from a shopping list to begin with, and thus will never be able to maintain the produced gibberish.
Trail bikes and the invisibility of the Versys
Just stumbled upon a midsize Japanese trail-bike comparison, and the Versys was distinctly absent. A V-Strom, a Ténéré and a Transalp, all different and each with its strengths, be it off-road or on, and yet no mention of my very subjectively beloved Versys.
So may I ask this venerable assembly its opinion on the subject? How come the Versys seems to fly under the radar? What is it lacking to be noticeable - or, more selfishly, what fun am I missing with mine?
Secrets don't belong anywhere inside application code. They're tied to the runtime environment - 'cause you don't use the same password for production and integration, right? - and should come from an external configuration source. That might be as simple as environment variables.
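A minimal sketch of the idea in Python, with hypothetical variable names; the deployment environment (systemd unit, container spec, CI secret store...) injects the values, the code only reads them:

```python
import os

# The environment sets these; nothing secret lives in the repository.
DB_HOST = os.environ.get("APP_DB_HOST", "localhost")

# Fail fast if the secret is missing rather than fall back to a default.
try:
    DB_PASSWORD = os.environ["APP_DB_PASSWORD"]
except KeyError:
    raise SystemExit("APP_DB_PASSWORD is not set; refusing to start")
```

Production and integration then differ only in the values injected, never in the code being deployed.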
Application deployment should never require modifying a file that resides inside the application itself. PHP and other interpreted languages have a tendency to promote laziness when it comes to proper release management.
And don't start with "but it makes development complicated": fix your onboarding and tooling instead of putting the security of your users and customers at risk.
That's a lot of words for "I'm too lazy to master the most essential tool of my professional life and keep it updated to my requirements".
If you don't want to do it, feel free to pay for an Ubuntu support subscription, open a ticket, and get back to work. As you said: you should be working on your problem instead of whining. Or maybe you earn more by whining?
There's a saying that goes: "A bad workman always blames his tools."
EOL of version 7 is next June; you've got a nice pile of work ahead!
And *nix shells are a perfect example of the KISS principle.
On behalf of garbage, I loudly protest this attempt to equate it with PowerShell.
You can add support-contract requirements for some pieces of software coming from vendors with so little confidence in their product that they'd rather have it run on an environment of outdated dependencies. A side effect of the logic you talked about, applied to software vendors.
Yet another example of (for some, not-so) old, control-freak farts who just don't understand the world they live in. The bill is entitled "Regulation of the digital space". As if a single country could regulate an international network.
Sometimes I'm really ashamed of our politicians.
Unplug your mouse. Seriously. Do it. It might sound like the "kicking and screaming" method, but you'll learn to rely on your keyboard even for GUI tools and you'll vastly improve how fast you navigate your computer. You should find yourself more and more in the terminal, obviously, but you may also learn some nice tricks with everything else.
Simple: because it goes against the KISS principle. The GNU tools that constitute the user interface to the system come from a philosophy that started with Unix: simple tools, doing one thing well, communicating through "pipes" - i.e. the output of one tool is meant to be used as the input of the next one.
This philosophy lets you assemble complex logic and workflows from a few commands, automating a lot of mundane tasks, but also lets you work at large scale the same way you would on a few files or tasks.
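As a sketch of what that buys you - written here with Python's subprocess for illustration, though a shell one-liner does the same - assuming a hypothetical words.txt with one word per line, this chains the same processes a shell would with `sort words.txt | uniq -c | sort -rn`:

```python
import subprocess

# Each tool's stdout becomes the next tool's stdin, like a shell pipeline.
with open("words.txt") as src:
    p1 = subprocess.Popen(["sort"], stdin=src, stdout=subprocess.PIPE)
p2 = subprocess.Popen(["uniq", "-c"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
p3 = subprocess.Popen(["sort", "-rn"], stdin=p2.stdout, stdout=subprocess.PIPE)
p2.stdout.close()
print(p3.communicate()[0].decode())  # most frequent words first
```

Three tiny tools, none of which knows about the others, compose into a word-frequency counter.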
Graphical tools don't have such advantages:
- UIs are rarely uniform in their presentation or logic, as there are so many ways to present options and choices;
- GUIs are rarely automatable - Apple did something nice in that direction with AppleScript, but I've not encountered it anywhere else - which means repeated tasks require clicking and pushing buttons, or the GUI has to be altered to replay a set of commands on multiple items;
- interconnecting different GUIs so that they can exchange data is just impossible. You usually end up with files in dedicated formats, and the need to massage data from one format to another to chain tasks across different GUIs;
- more importantly, the command line works with minimal bandwidth and tooling on the client side. Tmux, Mosh and similar tools let you work over an intermittent connection and have a very low impact on the managed system;
- in some specific fields - notably embedded and industrial systems - you just can't justify allocating resources for a graphical environment. On these systems, the CLI is as powerful as on a full-fledged server and doesn't require stealing precious resources from the system's main purpose.
Beware though: as time passes, Unix's founding principles seem to get forgotten, and some CLI tools show a lack of user-experience design, diverging from the usual philosophy and making the life of system administrators difficult. I've often observed this with tools coming from recent languages - Python, Go, Rust - where the "interface" of the tool is closer to the language it's written in than to the uniform CLI conventions.
Nice idea! All the more since, while software-related AI seems to be more and more open-sourced, it really looks to me seriously lacking on the hardware side. Open-source code running on proprietary hardware is a good start, but we'd all be better off if it were open and public through and through.
Conjugations. The imperfect subjunctive (imparfait du subjonctif). 'nough said!
Try French: the same written word pronounced differently depending on its meaning - which can only be deduced from context - and differently written words pronounced the same - with different meanings, obviously... and there's always poetic license if you wish to muddle things a bit more!
The example is more an issue of using the right tool for the job: class inheritance and method overriding would properly deal with the employee/manager/C-suite distinction.
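A minimal Python sketch of what I mean, with hypothetical names and bonus rules:

```python
class Employee:
    def __init__(self, name: str, base_salary: float):
        self.name = name
        self.base_salary = base_salary

    def bonus(self) -> float:
        return self.base_salary * 0.05


class Manager(Employee):
    def bonus(self) -> float:  # overrides the base behavior
        return self.base_salary * 0.10


class CSuite(Manager):
    def bonus(self) -> float:
        return self.base_salary * 0.20


# Callers never branch on a "kind" field; the type carries the distinction.
for person in (Employee("Ann", 40_000), Manager("Bob", 60_000), CSuite("Eve", 90_000)):
    print(person.name, person.bonus())
```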
All recent CPUs have native virtualization support, so there's close to no performance hit on VMs.
That being said, even a VM is subject to exploits, and malicious code could break out of the VM down to its hypervisor.
The only secure way of running suspicious programs starts with an air-gapped machine and a cheap HDD/SSD that will go straight under the hammer as soon as testing is complete. And even after that, I'd still wonder whether the BIOS might have been compromised.
On a lower level of paranoia and/or threat, a VM on an up-to-date hypervisor, with a snapshot taken before doing anything questionable, should be enough. You'd then only have to fear a zero-day exploit of said hypervisor.
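With libvirt, for instance, the snapshot-and-revert routine scripts down to a few lines; a minimal sketch, assuming a hypothetical VM domain named sandbox managed through virsh:

```python
import subprocess

DOMAIN = "sandbox"      # hypothetical VM name
SNAPSHOT = "pre-test"   # hypothetical snapshot name

def virsh(*args):
    subprocess.run(["virsh", *args], check=True)

# Freeze the VM's state before touching anything suspicious...
virsh("snapshot-create-as", DOMAIN, SNAPSHOT)

# ... run the questionable program inside the VM here ...

# ... then throw away everything it did by rolling the state back.
virsh("snapshot-revert", DOMAIN, SNAPSHOT)
virsh("snapshot-delete", DOMAIN, SNAPSHOT)
```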