# Computer Operating System Choices



## harsh (Jun 15, 2003)

The latest Lansweeper report (drawn from 27 million Windows devices) says that Windows 11, at a 3% share, still isn't as popular as Windows 7.

Windows 11's share is 2.5% in businesses and 3% in the home market.

Where are personal computer users going to go?

If you choose Other, please offer up some specifics (e.g., FreeBSD).


----------



## dtv757 (Jun 4, 2006)

I'll stay on Windows; not a fan of Apple stuff.


----------



## P Smith (Jul 25, 2002)

If buying a new computer, stick with whatever OS the manufacturer pre-installed.


----------



## b4pjoe (Nov 20, 2010)

The computer I use most is a Mac running macOS Monterey. I have PCs running Windows 7, 10, and 11. The PC running Windows 11 can also run macOS Big Sur. And a really old Dell running Windows 98. I have an Acer Chromebook. I also have Parallels running on my Mac with installs of earlier versions of Mac OS X, Ubuntu, CentOS, DOS 6.2, and the three versions of Windows listed previously.


----------



## NYDutch (Dec 28, 2013)

My "daily driver" OS has been Linux, specifically PCLinuxOS, since 1999. I do keep a copy of Windows 10 installed in VirtualBox to support family members who haven't seen the light yet.


----------



## audiomaster (Jun 24, 2004)

b4pjoe said:


> The computer I use most is a Mac running macOS Monterey. I have PCs running Windows 7, 10, and 11. The PC running Windows 11 can also run macOS Big Sur. And a really old Dell running Windows 98. I have an Acer Chromebook. I also have Parallels running on my Mac with installs of earlier versions of Mac OS X, Ubuntu, CentOS, DOS 6.2, and the three versions of Windows listed previously.


Am I right that you need to be a more "computer savvy" person to use Linux than Windows?


----------



## harsh (Jun 15, 2003)

audiomaster said:


> Am I right that you need to be a more "computer savvy" person to use Linux than Windows?


Not a whole lot for basic things like web browsing and e-mail but if you seek to run specific Windows applications versus similar Linux-based applications, Linux doesn't make that easy (just as Windows doesn't make some basic functions as easy as they should be). At the same time, some things in Linux are just easier all the way around.

The happy part about Linux is that you don't have to perform a long installation that completely wipes out your existing system just to try it out. You can boot a modern Linux from a USB key in a minute or so and be off and running with your network printers automatically configured.
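That try-before-you-install step is simple to sketch. A minimal example, assuming GNU coreutils; the ISO filename and target device are placeholders you must substitute (double-check the device with `lsblk`):

```shell
# Sketch: copy a Linux live ISO onto a USB stick so you can boot from it.
# WARNING: dd overwrites the target device without asking; verify the
# device path (e.g. with `lsblk`) before running this for real.
write_live_usb() {
    iso="$1"    # path to the downloaded live ISO
    dev="$2"    # target device, e.g. /dev/sdX (placeholder!)
    [ -r "$iso" ] || { echo "cannot read ISO: $iso" >&2; return 1; }
    dd if="$iso" of="$dev" bs=4M   # raw block-for-block copy
    sync                           # flush caches before unplugging
}

# Hypothetical usage (device name is a stand-in):
# write_live_usb pclinuxos-2022.iso /dev/sdX
```

Most distributions also publish checksums; comparing `sha256sum` output against the published value before writing is cheap insurance.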


----------



## harsh (Jun 15, 2003)

I'd like to see more explanations of "Other" since it seems to be tied for second place.


----------



## billsharpe (Jan 25, 2007)

Both my 2-year-old desktop and new laptop are running Windows 11. I also have a 12-year-old laptop that is now running Linux. I also have an iPad and iPhone. I'm reasonably happy with all of these.


----------



## NYDutch (Dec 28, 2013)

audiomaster said:


> Am I right that you need to be a more "computer savvy" person to use Linux than Windows?


Several years ago I taught an introduction-to-computers class at a senior center. My students ranged from their late 60s to their mid 90s. As part of the course, I walked them through installing PCLinuxOS on a variety of older donated PCs that they could take home. As we got into using the OS, the most common comment I got from those with some familiarity with Windows was how similar the KDE desktop environment was to the Windows environment. We covered installing and using a variety of common Windows-equivalent programs, such as LibreOffice instead of MS Office, Thunderbird instead of MS Outlook, and the Firefox browser instead of MSIE (Chrome hadn't taken off yet). One of the highlights for me, a few years later, was receiving an email from a former student containing an invitation to her 100th birthday party, where she mentioned how much she was still enjoying her Linux installation. Sadly, she died just before her 101st birthday. The point, though, is that none of these people were "computer savvy", yet they all found Linux quite useful in their everyday lives. Until I lost track, only a couple had switched to Windows, at least one of them due to family pressure.


----------



## harsh (Jun 15, 2003)

Too many people buy computers without knowing what software they're going to run, hoping that it's somehow safer to go mainstream because that's where everyone else is.

I like to use Thunderbird and Outluck to compare the Free and Open Source Software (FOSS) world to the Micro$oft world. Outluck looks like it was designed by a collage artist while Thunderbird is relatively tidy and quite configurable. Until such time as Libre Office mandates "the ribbon" (hopefully never), it is just so much cleaner than Microsoft 365.

The Micro$oft ribbon metaphor is really a blight on the computer world.

What's with task bars/docks being at the top or bottom of the screen where they steal valuable screen space? This is especially important given the inexplicable popularity of 13" and under notebooks where side-by-side is really only useful for banner advertising.

Sure, there are add-ons and plug-ins to make the mainstream products less nonsensical but they usually end up being kludges and break with each major release; either the program itself, the OS it runs on or both.


----------



## b4pjoe (Nov 20, 2010)

In my experience you can either hide the ribbon (and bring it back by mousing over where it was) or disable it completely in Microsoft Office/365. I use LibreOffice personally, but I had to support Office for other users at my job. Most of those users couldn't even tell you what operating system they were using.


----------



## renegade (Jul 28, 2011)

I'll stay on Apple; not a fan of Windows stuff.


----------



## P Smith (Jul 25, 2002)

harsh said:


> Outluck


Outbuck perhaps ?


----------



## harsh (Jun 15, 2003)

P Smith said:


> Outbuck perhaps ?


That makes no sense whatsoever.


----------



## b4pjoe (Nov 20, 2010)

Seriously what is Outluck? I thought it was maybe a typo for Outlook but you typed it that way twice.


----------



## harsh (Jun 15, 2003)

b4pjoe said:


> Seriously what is Outluck? I thought it was maybe a typo for Outlook but you typed it that way twice.


It is an intentional misspelling of Outlook that I feel better represents what you can expect from the product.


----------



## b4pjoe (Nov 20, 2010)

I can agree with you. It is a horror of an app.


----------



## James Long (Apr 17, 2003)

harsh said:


> It is an intentional misspelling of Outlook that I feel better represents what you can expect from the product.


Then outbuck applies too. You will be out many bucks using Microsoft.


----------



## harsh (Jun 15, 2003)

James Long said:


> Then outbuck applies too. You will be out many bucks using Microsoft.


As it is typically something that comes bundled with other Micro$oft products, it usually costs you only eye strain, stomach lining and premature gray hairs.

Micro$oft does offer Outluck as a standalone product, but you'd have to be pretty hard up, as the MSRP is $159.99 ($90 less than Office Home and Business, which adds Word, Excel and PowerPoint). The "security" (a word that Micro$oft should be prohibited from using in their advertising) features require an e-mail account through Micro$oft (@hotmail.com, @live.com, @msn.com or @outlook.com).


----------



## b4pjoe (Nov 20, 2010)

harsh said:


> The "security" (a word that Micro$oft should be prohibited from using in their advertising) features require an e-mail account through Micro$oft (@hotmail.com, @live.com, @msn.com or @outlook.com).


No it doesn't. I've set it up many times where the user is just using our domain email. Not a Microsoft one.


----------



## harsh (Jun 15, 2003)

b4pjoe said:


> No it doesn't. I've set it up many times where the user is just using our domain email. Not a Microsoft one.


I'm speaking uniquely of the "security" features that Microsoft touts in their Outluck feature list:


Micro$oft said:


> Includes advanced security with message encryption and removal of dangerous attachments*
> 
> *Applies to customers who have an @outlook.com, @hotmail.com, @live.com, or @msn.com email address.


Outluck can work with most IMAP servers (and it is Microsoft's only product that tries to deal with POP3 servers) but that's not what I'm talking about.

Outluck can be a bit cranky (though not as cranky as Apple's apps) about e-mail services that don't have the appropriate autodiscovery code embedded in a certain place on a website along with some special text in the domain configuration.
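For what it's worth, the "special text in the domain configuration" is usually a DNS record pointing clients at an autodiscover endpoint. A hypothetical zone-file sketch (all names are placeholders, not settings for any real domain):

```
; Hypothetical autodiscover records for example.com
autodiscover.example.com.        IN CNAME  mail.example.com.
; or, alternatively, an SRV record on port 443:
_autodiscover._tcp.example.com.  IN SRV    0 0 443 mail.example.com.
```

Thunderbird uses a similar but separate mechanism (an XML file served from an `autoconfig` subdomain), which is part of why one client can find settings that another can't.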


----------



## James Long (Apr 17, 2003)

harsh said:


> As it is typically something that comes bundled with other Micro$oft products, it usually costs you only eye strain, stomach lining and premature gray hairs.


One does have the choice of not using (or paying for) any of the expensive Microsoft applications while still using the operating system.


----------



## harsh (Jun 15, 2003)

James Long said:


> One does have the choice of not using (or paying for) any of the expensive Microsoft applications while still using the operating system.


This is a question that I pose to anyone using Office for Mac (not that Pages or Numbers are logical substitutes, but who really needs Office?).


----------



## TimeLord04 (Sep 5, 2021)

10-15-2022 at 8:08 AM - PDT

Sorry to be late to the party, here. I'm running MacOS Mojave on my primary system, a Hackintosh that I built in December of 2018. It is an i7-7700K, 32GB DDR4 RAM, and has a Fenvi FV-T919 WiFi AC Card that MacOS sees as 100% "AirPort Extreme" capable. The system Dual Boots to Win 7 Pro x64. The monitor is a BenQ 4K EL2870U 28". MacOS thinks it's an iMac 27" Retina 5K, Mid 2017 model. I'm running the Clover Boot Loader for the MacOS install.

I'm staying on Mojave on the Hackintosh due to the need to run some 32-Bit Apps that later MacOS versions drop support for. However, the hardware IS capable of running the latest MacOS. The Hackintosh has a Hot Swap Bay with all 3 Drive Bays full. MacOS is in the Primary Bay on a Samsung 860 Pro 1TB SSD, Win 7 Pro x64 is in the Secondary Bay on another Samsung 860 Pro 1TB SSD, and the Third Bay has a Western Digital Black 2TB 7200 rpm Hard Drive partitioned into two 1TB Partitions - one for MacOS Time Machine and the other for Games.

My other two Desktops are Mac Pro models. One 5,1 2010 System, Dual Xeon 2.4GHz, 64GB DDR3 ECC RAM, with all 4 Drive Bays full. Three Samsung 860 EVO 1TB Drives, one for MacOS Mojave, one for MacOS Sierra 10.12.1 with the full ADOBE Suite of software, and the third bay with BootCamp for Win 7 Pro x64, and the 4th Bay with one Western Digital Black 2TB Drive set up identical to the Hackintosh WD Drive. Finally, the Mac Pro 3,1 Early 2008 System, Dual Xeon 2.8GHz, 32GB DDR2 ECC RAM, all 4 Drive Bays full. I'm still working on cloning over the OSes from Hard Drives to Samsung SSDs for the MacOS Drives.... One Hard Drive for MacOS Snow Leopard 10.6.8, (just for fun to have it), the second drive is El Capitan 10.11.6, the third Bay is BootCamp with Win 10 Pro on one Samsung 860 EVO 1TB Drive, and Bay 4 is another WD Black 2TB Drive, again set up identical to the Hackintosh Western Digital Drive.

GPUs.... The Hackintosh has an AMD/ATI Radeon Sapphire Pulse RX-580 8GB Card. The 5,1 has an NVIDIA GTX-770 4GB Card. The 3,1 has an NVIDIA GTX-970 4GB Card. The 5,1 currently has an ASUS VE-278 27" 1080p monitor, and the 3,1 has an older Viewsonic 22" 1080p DVI monitor. I'm saving up to replace the ASUS Monitor with a 32" BenQ 4K monitor, then move the ASUS monitor to dad's Desktop System, and move his ASUS 24" to the 3,1 and then retire the Viewsonic.

OH, almost forgot, the 5,1's WiFi Card has been Upgraded to WiFi AC with Bluetooth 4 and the 3,1 has a stock Apple WiFi Card with Bluetooth 2, if memory serves correctly.

I LOVE computer hardware. 

*[EDIT:]* I use LibreOffice on all three systems.


TimeLord04


----------



## P Smith (Jul 25, 2002)

TimeLord04 said:


> I LOVE computer hardware.


I'm curious how much you put out of your pocket into all the systems ? And what is your return from all the systems ?


----------



## TimeLord04 (Sep 5, 2021)

P Smith said:


> I'm curious how much you put out of your pocket into all the systems ? And what is your return from all the systems ?


12:39 PM - PDT

@P Smith

The Mac Pros I got used and on the cheap. The most expensive system is the Hackintosh, which, again, was built in December of 2018 after having saved up during the year of 2018 for the various parts I wanted. The Hackintosh was $4K to build. I left out of my last post the Motherboard for the Hackintosh, which is a Gigabyte H-270-HD3. Total spent to date on all three systems is about $6.5K. The 3,1 came with the 32GB of RAM; in the 5,1 I changed the RAM from the original 12GB to a full change-out of new 64GB RAM from iBuildMacs.com. The $4K on the Hackintosh includes a GPU change-out from an NVIDIA GTX-1070 8GB Card to the AMD/ATI Radeon Sapphire Pulse RX-580 8GB Card due to the 'Apple and NVIDIA wars' which forced Mac users who wanted to move to Mojave from High Sierra to change out their Maxwell and Pascal NVIDIA GPUs. (*[EDIT:]* Kepler NVIDIA GPUs were 'still allowed' by Apple for OS recognition in MacOS Mojave.)

All the GPUs in these systems are Mac Flashed for Apple Boot Screen capabilities. The GTX-770 came flashed from a friend. The GTX-970 was Flashed by MacVidCards, and the AMD/ATI Card came pre-flashed by MacVidCards, AND the AMD/ATI Card was purchased by me NEW in-box from MacVidCards. The GTX-1070 8GB Card, no longer being used, also came from MacVidCards, and was my most expensive Mac Flashed Card. I got the AMD/ATI Card on a VERY good sale by MacVidCards for $309 Plus Shipping at $14, just before the chip shortage that shot CPUs and GPUs through the roof. The 1070 cost me $555 Plus $14 Shipping from MacVidCards.

All in all, I don't think I've done too badly. I've bought these systems in an effort to keep my IT Tech skills up. My daily driver is the Hackintosh. The 3,1 is an almost identical twin to another 3,1 I got used for my brother. He couldn't afford a 'High End' Desktop system, and his daughter, (my niece), accidentally sat on and broke his laptop. So, I scrounged around and a friend in Alabama, (a university Professor), got me one of the university systems that was being retired for my brother. The other 3,1 that I now have also came from my university friend. My brother's system only has 24GB of RAM, but otherwise is identical to my 3,1 in hardware specs. Upgrades to my brother's 3,1, (new monitor, keyboard, mouse, and SSDs, and a soundbar), brought his 3,1 to about $1,100 grand total. My brother's system has MacOS El Capitan 10.11.6 and BootCamp Windows 10 Pro x64 on Samsung 860 EVO 1TB SSD Drives, and a 4TB Hard Drive formatted in NTFS as a secondary drive to Windows for all the family's data, and a 1TB Hard Drive for MacOS Time Machine. My 3,1 does NOT have the soundbar, but does utilize a spare set of Altec Lansing 42.1 Speaker Set built for computer systems. The Hackintosh has an Altec Lansing 45.1 set of speakers, and the 5,1 has an Altec Lansing ACS-295 set of speakers. All the Altec Lansing speaker sets have Subwoofers included. My brother's soundbar is more modern and is digital with the Digital Optic Cable from the Digital Out of the Mac Pro to the soundbar. *[EDIT 3]* The soundbar is from Yamaha, (bought at Best Buy), and came with a subwoofer.

Again, I'm NOT yet done with the hardware change outs for my 3,1. I still need two more Samsung SSDs for Snow Leopard and El Captain. The 860 EVOs are no longer available, so I have my eyes on the 870 EVO and 870 QVO models. I'm a little strapped for cash at the moment, and may NOT get to these for another several months.

I do the best I can with the money I have, just like everyone else here. I make use of my friends and contacts to the best of my ability.

*[EDIT 2:]* I also forgot to mention that I have two Logitech C920S Webcams, one on the Hackintosh, and one on the 5,1. Bought these, new at $65 and $69 respectively AFTER the COVID issues that shot these Webcams UP to the $400+ range! I waited just about 2.5 years to get these at decent prices. I had dad purchase the same webcam for his Desktop, also at $65. One of mine I got directly from Logitech, (the one for the Hackintosh), and the one for my 5,1 and the one for dad's system we got from Best Buy.


TimeLord04


----------



## P Smith (Jul 25, 2002)

Good essay... so what was the total amount ? 
As I got from above, the ROI is close to $0, it's just a base to keep your A+ skills, not making some income for you.


----------



## TimeLord04 (Sep 5, 2021)

P Smith said:


> Good essay... so what was the total amount ?
> As I got from above, the ROI is close to $0, it's just a base to keep your A+ skills, not making some income for you.


10-16-2022 at 12:47 AM - PDT

I stated that the total invested so far in all 3 computers is $6.5K. Most of this is the Hackintosh. ROI is 3 working systems that I can run various software programs on, including the ADOBE suite on the Sierra 10.12.1 Drive, and 32 Bit Apps and Games like 7th Guest, 11th Hour, and Wing Commander III under Wine from GOG.com.

*[EDIT:]* AND, the 3,1 doubles as a twin machine to my brother's 3,1 so that I can troubleshoot things and duplicate scenarios that he runs into in Windows 10 on his system. (AND, the GPU prices are included in the $6.5K grand total to date.)


TimeLord04


----------



## P Smith (Jul 25, 2002)

Great ! Wish I could afford such expenses and upgrade my old computers ...


----------



## TimeLord04 (Sep 5, 2021)

P Smith said:


> Great ! Wish I could afford such expenses and upgrade my old computers ...


6:49 AM - PDT

@P Smith 

Well, as to 'affording' these... 

I charged my way, WAAAAYYYY into debt on credit cards. My rationale was that with Mac OS X knowledge, I'd be more employable. Hasn't worked out that way, though. In the meantime, over the last year, since December 2021, I was LUCKY enough to score a 0% Interest, Introductory Offer on another Credit Card. 0% for 15 Months! So, I grabbed that chance, got approved, and transferred MOST of the higher interest debt on other cards over to the 0% card. I'm now down to owing $1,200, and at $300 a month, I will have the debt paid off by February. (Two months early, as I have until April of next year!)

So, essentially, I owe, now, about $400 on each computer at 0% Interest until April. At which point, I will then pick up the two SSDs I need to complete my 3,1, and by then, also have the new 32" BenQ 4K Monitor for the 5,1 so that I can move the ASUS 27" 1080p monitor to dad's system, and move his ASUS 24" 1080p monitor over to my 3,1.

Wish me luck. 


TimeLord04


----------



## P Smith (Jul 25, 2002)

May the force be with you 
I would guess your father is my age, and I must tell you this... I found a 32" 1080p monitor produces very readable Visual Studio pages of source code for aging eyes.
Those 4K monitors make for too much eye strain.


----------



## TimeLord04 (Sep 5, 2021)

P Smith said:


> May the force be with you
> I would guess your father is my age, and I must tell you this... I found a 32" 1080p monitor produces very readable Visual Studio pages of source code for aging eyes.
> Those 4K monitors make for too much eye strain.


10:15 AM - PDT

@P Smith 

I'm 'older' myself, and can tell you that I CANNOT read the small text on my 28" 4K BenQ EL2870U on the Hackintosh without reading glasses! I'm hoping that having the 32" 4K BenQ monitor on the 5,1, (still with reading glasses), will be 'easier' to read the text on.... We'll see. (OMG, is that a pun???)

The Real Estate space utilizing 4K, though, is just FANTASTIC. I'm glad I went 4K on the Hackintosh. The ASUS VE-278 27" 1080p monitor originally was on the Hackintosh. I moved it over to the 5,1 when I upgraded to the BenQ 28". I wish I could get a picture of this 'Attic/Office' space I'm in on this Site; the 'Desk Space' for the Hackintosh does NOT allow for a monitor larger than the 28" and still allow the Logitech C920S Webcam to comfortably sit on top of the monitor.... However, the card table, more centered in the room, WILL allow for the 32" BenQ 4K monitor there for the 5,1. The 3,1 sits to the right of the Hackintosh, and its 'Desk Space' could comfortably fit a 27" or 28" monitor.

The 'Attic/Office' is 'reclaimed space' after dad remodeled the house. Access to the 'Attic/Office' is through the upstairs spare bedroom. The 'Attic/Office' essentially sits above the garage. Because the 'Attic/Office' is above the garage, I'm limited by the slope of the roof. Taller people have to stoop to stand in the 'Attic/Office'; however, my 5'4" stature seems well suited to the environment.

In summertime, the house air conditioning vent in the 'Attic/Office' is insufficient to keep the room 'comfortable'. Yet it also doesn't get too hot for the computers to run. However, I DON'T run all three in the room at once. The two Mac Pros are usually off. The Hackintosh runs 24/7.


TimeLord04


----------



## P Smith (Jul 25, 2002)

Well, then I should try my Sammy UHD TV 55" QLED as a monitor ...


----------



## TimeLord04 (Sep 5, 2021)

P Smith said:


> Well, then I should try my Sammy UHD TV 55" QLED as a monitor ...


3:31 PM - PDT

hmmmm.... Possibly overkill...???... 😳😳😳

I assume you have a laptop that you'd sit comfortably away from the 55" TV at a decent distance???...??? But, yeah, a 4K 55" TV would make one hell of a monitor.


TimeLord04


----------



## P Smith (Jul 25, 2002)

TimeLord04 said:


> 3:31 PM - PDT
> 
> hmmmm.... Possibly overkill...???... 😳😳😳
> 
> ...


I could tell you that after I set it up; there is a little problem - my laptop is the business-class kind, so I need to find a VGA-to-HDMI converter first.


----------



## SledgeHammer (Dec 28, 2007)

TimeLord04 said:


> So, essentially, I owe, now, about $400 on each computer at 0% Interest until April. At which point, I will then pick up the two SSDs I need to complete my 3,1, and by then, also have the new 32" BenQ 4K Monitor for the 5,1 so that I can move the ASUS 27" 1080p monitor to dad's system, and move his ASUS 24" 1080p monitor over to my 3,1.


Get out of debt only to start racking up new debt.



TimeLord04 said:


> Wish me luck.


You'll definitely need it.


----------



## harsh (Jun 15, 2003)

There are a few videos on YouTube about the ills of using 4K monitors with Mac OS. Apparently Apple decided that 4K meant they should double 2K, so what you end up with is an effective 1080p desktop, since scaling really takes the stuffing out of the system (even with Apple silicon).


----------



## kucharsk (Sep 20, 2006)

harsh said:


> There are a few videos on YouTube about the ills of using 4K monitors with Mac OS. Apparently Apple decided that 4K meant they should double 2K, so what you end up with is an effective 1080p desktop, since scaling really takes the stuffing out of the system (even with Apple silicon).


That's not accurate; my Macs can send true 4K (and higher) to a monitor that can handle it.


----------



## TimeLord04 (Sep 5, 2021)

SledgeHammer said:


> Get out of debt only to start racking up new debt.
> 
> 
> 
> You'll definitely need it .


10-17-2022 at 9:24 AM - PDT

@SledgeHammer 

Actually, once February's payment hits and I'm out of debt, March's $$$ is 'spendable cash', NOT 'debt payments'. So, I'll have the $$$ for two Samsung 870 EVO or 870 QVO 1TB SSDs to finish my cloning of Snow Leopard and El Capitan from 'spinning rust' drives over to more stable, faster, and larger SSDs.

Same with the 32" BenQ monitor for the 5,1. I'm actually spending accessible cash, NOT putting debt on credit cards.


TimeLord04


----------



## TimeLord04 (Sep 5, 2021)

kucharsk said:


> That's not accurate; my Macs can send true 4K (and higher) to a monitor that can handle it.


9:31 AM - PDT

@kucharsk 

Absolutely! My Hackintosh is displaying in native 4K at 3840 x 2160. The BenQ 28" 4K EL2870U monitor is also HDR. So will be the BenQ 32" that I'm saving for. I have $50 already saved this month towards the 32" BenQ.


TimeLord04


----------



## harsh (Jun 15, 2003)

kucharsk said:


> That's not accurate; my Macs can send true 4K (and higher) to a monitor that can handle it.


They can technically send it but the result may be less than desirable (typically the text is too small on a <43" TV). When you ask it to start scaling things, it has a performance impact (according to at least two of the videos that I watched).


----------



## P Smith (Jul 25, 2002)

harsh said:


> They can technically send it





harsh said:


> Apparently Apple decided that 4K *meant they should double 2K, so what you end up with is an effective 1080p desktop*, since scaling really takes the stuffing out of the system (even with Apple silicon)


Which one is technical post?


----------



## harsh (Jun 15, 2003)

P Smith said:


> Which one is technical post?


Both?


----------



## NYDutch (Dec 28, 2013)

Even Microsoft is onboard with Linux these days... 

*Microsoft Releases Its Linux Distribution Update For October 2022* (www.phoronix.com)


----------



## harsh (Jun 15, 2003)

NYDutch said:


> Even Microsoft is onboard with Linux these days...


I'd rather get my hands on the Google distribution based on Debian Unstable.

The Microsoft product is there because Windows doesn't offer the efficiency and variety of filesystems that modern cloud systems demand.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> I'd rather get my hands on the Google distribution based on Debian Unstable.
> 
> The Microsoft product is there because Windows doesn't offer the efficiency and variety of filesystems that modern cloud systems demand.


I'm working on this project at work where I need to deal with an Ubuntu Linux VM. How you think this is a good OS for day-to-day work for the average person is beyond me. Between all the security nuances, crazy command lines, etc., this has taken much longer than it had to. We even had all our scripts working on the Mac, which is supposed to be close to Linux, but when we tried to run them on Linux, it was one issue after another.

Even something stupid like setting an environment variable from a script has to be overly complicated.

There was another thing where we had to pipe in inputs to stdin which worked fine on the Mac, but on Ubuntu we got an "ioctl error". That's another issue, all the error messages are so cryptic and require a bunch of time googling.

And we can't run a UI on that VM because its an "operational" Linux box.

Sure, it's fine as a base for your Docker image for a K8s pod, but I can see why it has only 2% market share lol.

Windows has lost a bit of market share to Mac over the past decade, but not a whole lot. Linux has always been a non-event for the home market.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> I'm working on this project at work where I need to deal with an Ubuntu Linux VM. How you think this is a good OS for day-to-day work for the average person is beyond me. Between all the security nuances, crazy command lines, etc., this has taken much longer than it had to. We even had all our scripts working on the Mac, which is supposed to be close to Linux, but when we tried to run them on Linux, it was one issue after another.


I hope I've not given anyone the impression that I'm a fan of Ubuntu (especially recent versions that feature Snap packages). Ubuntu offers desktop versions as a courtesy; their primary area of concentration is servers and the like, where they can sell support contracts.


> Even something stupid like setting an environment variable from a script has to be overly complicated.


You speak as if that is substantially easier in a Windows environment. Depending on whether the variable is a system variable or a user variable, that kind of stuff may even require logging out and back in again in Windows!

In most popular Linux (and Mac) shells, it is a simple "export <parameter>=<value>".
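For instance, in a bash or zsh session (the variable name here is invented for the example):

```shell
# Set a variable in the current shell and export it to child processes.
export BUILD_MODE=release

# The current shell sees it...
printf 'current shell: %s\n' "$BUILD_MODE"

# ...and so does any child process it launches:
sh -c 'printf "child process: %s\n" "$BUILD_MODE"'
```

Contrast that with Windows, where, roughly speaking, `setx` writes to the registry and only takes effect in newly opened sessions.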


> There was another thing where we had to pipe in inputs to stdin which worked fine on the Mac, but on Ubuntu we got an "ioctl error". That's another issue, all the error messages are so cryptic and require a bunch of time googling.


That seems like it may be more of an issue with design metaphors on the platform that the software originated on than with a different operating system. I find new programming languages like Rust and Go to be less obtuse than PowerShell and its long-winded OOP classes.


> And we can't run a UI on that VM because its an "operational" Linux box.


Spinning up another VM takes seconds (unless you're using the WSL). Maybe you could approach it that way.


> Windows has lost a bit of market share to Mac over the past decade, but not a whole lot. Linux has always been a non-event for the home market.


With school-age kids growing up with school-provided Chromebooks, I think that's likely to change pretty quickly. Windows is just too messy to interoperate with given its unique character sets and lack of modern filesystem support. The Mac platform could pick up a lot of that slack if the hardware (and all the requisite adapters -- probably why users choose laptops over desktops in the Mac world) didn't cost so damn much.

For me, the most unfathomable downside of programming on Windows is that Microsoft keeps introducing new programming tools. How long do you suppose it will be before TypeScript replaces C#?

Apple has only put their development community through a couple such changes (Pascal to Objective C and then to Swift) and each time, they lost some key developers.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> You speak as if that is substantially easier in a Windows environment.  Depending on whether the variable is a system variable or a user variable, that kind of stuff may even require logging out and back in again in Windows!


Yes, it is. Much.

If I run a batch file that sets an environment variable, and the batch file finishes, the environment variable is set as expected.

If I run a shell script that sets an environment variable, and the shell file finishes, the environment variable is *NOT* set as expected because it ran in a "child shell". To get it to stick in the current shell as a batch file would, you need to run it with the source command. And that's after you ran the chmod +x on it in the first place.
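That child-shell behavior is easy to demonstrate in a terminal; a throwaway example (the filename and variable are invented):

```shell
# A script that exports a variable:
cat > setvar.sh <<'EOF'
export DEMO_VAR=from_script
EOF
chmod +x setvar.sh

# Run it as a child process: the export dies with the child shell.
unset DEMO_VAR
sh ./setvar.sh
echo "after child run: ${DEMO_VAR:-unset}"   # -> unset

# Source it instead: it runs in the *current* shell, so the export sticks.
. ./setvar.sh
echo "after sourcing:  ${DEMO_VAR:-unset}"   # -> from_script
```

The same applies to `cd` in a script: it changes the child shell's working directory, not yours.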



harsh said:


> In most popular Linux (and Mac) shells, it is a simple "export <parameter>=<value>".


Not if you're trying to automate it. See above.



harsh said:


> That seems like it may be more of an issue with design metaphors on the platform that the software originated on than with a different operating system. I find new programming languages like Rust and Go to be less obtuse than PowerShell and its long-winded OOP classes.
> 
> Spinning up another VM takes seconds (unless you're using the WSL). Maybe you could approach it that way.


It's a VM that was spun up in AWS.



harsh said:


> With school-age kids growing up with school-provided Chromebooks, I think that's likely to change pretty quickly.


Not sure what you're talking about here because ChromeOS has even less market share than Linux.



harsh said:


> Windows is just too messy to interoperate with given its unique character sets and lack of modern filesystem support.


Windows uses UTF-16, which is a standard. Not sure what you're talking about here either. NTFS and even FAT32 are "good enough" for 99.99% of home users.



harsh said:


> The Mac platform could pick up a lot of that slack if the hardware (and all the requisite adapters -- probably why users choose laptops over desktops in the Mac world) didn't cost so damn much.


Most young people choose laptops regardless of the OS.



harsh said:


> For me, the most unfathomable downside of programming on Windows is that Microsoft keeps introducing new programming tools. How long do you suppose it will be before Typescript replaces C#?


As an actual software engineer, I'll quote the recent Twitter employee who got fired and say "you have no clue what you're talking about".

From day one of the IBM PC in 1983, the main programming language was C. First Borland C/C++, then when Microsoft took over it became Visual C/C++, but the language didn't change at all (aside from a few random hobbyists who used Pascal and/or Visual Basic for a short time; neither of those languages was ever used heavily in professional environments). Around the early 2000s, C# became a thing. 20 years later, it's still a thing if you're programming in Windows.

Except nobody really programs in Windows anymore except hobbyists and some small fly-by-night companies.

Most reputable tech companies today are using Java for the backend (some Kotlin as well). Facebook uses PHP. Some more recent unicorns like Uber use Go. For frontend, it's pretty much JavaScript and the framework of your choice.



harsh said:


> Apple has only put their development community through a couple such changes (Pascal to Objective C and then to Swift) and each time, they lost some key developers.


If you quit programming because a new language or technology came out, you shouldn't be a programmer.


----------



## NYDutch (Dec 28, 2013)

SledgeHammer said:


> Not sure what you're talking about here because ChromeOS has even less market share than Linux.


You do know that ChromeOS, Android, and hundreds of GNU distributions are all Linux based, don't you?


----------



## SledgeHammer (Dec 28, 2007)

NYDutch said:


> You do know that ChromeOS, Android, and hundreds of GNU distributions are all Linux based, don't you?


You do know that ChromeOS is counted separately, don't you? (Not by me, but by the people who count such things.) Those same counting people also say it's an Android phone and not a Linux phone (since I'll assume you know there are phones that ACTUALLY run mainline Linux, while Android runs a heavily modified Linux kernel).


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Windows uses UTF-16 which is a standard. Not sure what you're talking about here either. NTFS and even FAT32 is "good enough" for 99.99% of home users.


UTF-16 is mostly a Microsoft problem (shared, I believe, by Java and JavaScript) and it is supported mostly within NTFS and Microsoft's own file formats (that they've paid to establish as "standards"). Other modern operating systems use UTF-8 across the board, which requires as little as half the storage space yet still supports the full range of characters: UTF-8 uses one to four bytes per character, where UTF-16 uses either two or four. UTF-16 also suffers from the "little endian" versus "big endian" byte-ordering question (little-endian being the order used by both Intel and ARM CPUs).
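
The size and byte-order points are easy to check. A minimal Python sketch (any language with explicit encodings would show the same thing):

```python
text = "hello"  # plain ASCII

print(len(text.encode("utf-8")))      # → 5: one byte per ASCII character
print(len(text.encode("utf-16-le")))  # → 10: two bytes per character

# UTF-8 has only one byte order; UTF-16 has two, so files often carry a
# byte-order mark (BOM) up front to say which order they use.
print("hi".encode("utf-16-le"))  # → b'h\x00i\x00'
print("hi".encode("utf-16-be"))  # → b'\x00h\x00i'
print("hi".encode("utf-16"))     # BOM first, then the platform's native order
```
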

NTFS is fine for standalone desktop and portable computer use but at some point, everyone is going to need a NAS (especially if they are using laptops) and the best way to do that isn't using Windows.

The fact that many Windows applications (including some Microsoft applications) use the legacy Windows character sets rather than UTF-16 makes interoperability even within Windows a fairly nasty proposition without specialized tools or import/export procedures.

The dependence on file extensions (a holdover from the 1970s) is another bewildering oddity.

Don't get me started on the places where FAT32 doesn't intersect with NTFS.


> From day one of the IBM PC in 1983, the main programming language was C. First Borland C/C++, then when Microsoft took over it became Visual C/C++ but the language didn't change at all (aside from a few random hobbyists that used Pascal and/or Visual Basic for a short time, but neither of those languages were ever used heavily in professional environments).


I go back to practical programming where MBASIC and CBASIC were the order of the day. Turbo C didn't happen for quite a while (1987) and Borland C/C++ came in the '90s. You're misremembering the timeline pretty badly.


> If you quit programming because a new language or technology came out, you shouldn't be a programmer.


If you have mountains of code that use one paradigm and the OS company (Apple, in this case) intimates that thou shalt change thy evil ways without providing the necessary tools to make the conversion, why bother messing with the platform?

When I went to work for realz on DOS platforms (my original work experience was with COBOL on Xenix), Visual FoxPro was the Microsoft-recommended database platform and a lot of the code was nearly 100% FoxPro/dBASE/DBXL. Only later did it turn to Visual Basic and Visual C++, perhaps in combination with SQL Server (thankfully, Microsoft never recommended Access for serious applications).

All along the way, most Linux distributions offer just about any programming platform you want (without the complication of those hideous DLLs) along with some very competent compilers, linters, linkers and editors.

I've spent the better part of a week fussing with translating between Windows-1251, ASCII (with and without "ANSI"), UTF-16, UTF-8 and old Mac encodings as part of a personal project and I can assure you that it has been no fun at all.
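
The mechanics of that kind of conversion are straightforward once the source encoding is known; the guessing is the hard part. A minimal Python sketch (the sample string is hypothetical):

```python
# Bytes as a Windows program using the Cyrillic code page would write them.
cp1251_bytes = "Привет".encode("cp1251")
print(len(cp1251_bytes))  # → 6: one byte per character in the legacy code page

# Decode with the correct source encoding, then re-encode as UTF-8.
text = cp1251_bytes.decode("cp1251")
utf8_bytes = text.encode("utf-8")
print(len(utf8_bytes))  # → 12: Cyrillic letters take two bytes in UTF-8

# The trap: decoding with the *wrong* legacy codec rarely raises an
# error -- it silently produces mojibake, which is why identifying the
# source encoding is the painful part.
print(cp1251_bytes.decode("mac-cyrillic"))  # decodes "successfully" to garbage
```
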


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> You do know that ChromeOS is counted separately, don't you?


It is only important for those who are driven by numbers rather than what makes the most sense.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> UTF-16 is mostly a Microsoft problem (shared, I believe, by Java and JavaScript) and it is supported mostly within NTFS and Microsoft's own file formats (that they've paid to establish as "standards"). Other modern operating systems use UTF-8 across the board, which requires as little as half the storage space yet still supports the full range of characters: UTF-8 uses one to four bytes per character, where UTF-16 uses either two or four. UTF-16 also suffers from the "little endian" versus "big endian" byte-ordering question (little-endian being the order used by both Intel and ARM CPUs).


You seriously have no clue what you're talking about. Windows & Mac & iOS use UTF-16. Also, Linux has supported UTF-16 for ages.



harsh said:


> everyone is going to need a NAS (especially if they are using laptops) and the best way to do that isn't using Windows.


Lol. No, not everyone needs or has a NAS.



harsh said:


> The fact that many Windows applications (including some Microsoft applications) use the legacy Windows character sets rather than UTF-16 makes interoperability even within Windows a fairly nasty proposition without specialized tools or import/export procedures.


Again, you have no clue what you're talking about. Windows & C# uses UTF-16.



harsh said:


> The dependence on file extensions (a holdover from the 1970s) is another bewildering oddity.


Lol, so how do you differentiate file types? How are file extensions different from having to chmod +x everything?



harsh said:


> Don't get me started on the places where FAT32 doesn't intersect with NTFS.


Don't worry, I won't since they serve completely different purposes.



harsh said:


> I go back to practical programming where MBASIC and CBASIC were the order of the day. Turbo C didn't happen for quite a while (1987) and Borland C/C++ came in the '90s. You're misremembering the timeline pretty badly.


Here's another example where you seriously have no clue what you're talking about. First off, I specifically mentioned the IBM PC, not some dinosaur mainframe you grew up on with punch cards. Turbo C was a Borland product btw.



harsh said:


> If you have mountains of code that use one paradigm and the OS company (Apple, in this case) intimates that thou shalt change thy evil ways without providing the necessary tools to make the conversion, why bother messing with the platform?


Good thing you're retired. Technology changes every day. You wouldn't last a day in the real world if you whined every time there was a breaking change. That's SOP these days. And if you want to whine about breaking changes, you should have become a Microsoft developer since Microsoft was super stringent about no breaking changes up until .Net went open source. Java people have never cared about backwards compatibility.



harsh said:


> When I went to work for realz on DOS platforms (my original work experience was with COBOL on Xenix), Visual FoxPro was the Microsoft-recommended database platform and a lot of the code was nearly 100% FoxPro/dBASE/DBXL.


This explains why you are so out of touch with reality lol.



harsh said:


> All along the way, most Linux distributions offer just about any programming platform you want (without the complication of those hideous DLLs) along with some very competent compilers, linters, linkers and editors.


How is a DLL different from a jar file or a lib file on linux?



harsh said:


> I've spent the better part of a week fussing with translating between Windows-1251, ASCII (with and without "ANSI"), UTF-16, UTF-8 and old Mac encodings as part of a personal project and I can assure you that it has been no fun at all.


That sounds like you don't know what you're doing then. Moving between code pages is all automated and mostly transparent to the developer and has been for ages.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> It is only important for those who are driven by numbers rather than what makes the most sense.


Sadly, I live on planet earth where everything is driven by numbers. Won't you come and join us? Well, maybe not lol.


----------



## NYDutch (Dec 28, 2013)

SledgeHammer said:


> You do know that ChromeOS is counted separately, don't you? (not by me, but by the people that count such things). Those same counting people also say its an Android phone and not a Linux phone (since I'll assume you know there are phones that ACTUALLY run main branch Linux while an Android runs a heavily modified Linux kernel).


And the common desktop and server Linux kernel distros are GNU based. What's your point? Being able to modify the open source Linux kernel for your own needs is one of the great features of Linux. Even the Microsoft Linux based OS is easily modified if needed. Try that with a Windows NT kernel based OS.


----------



## SledgeHammer (Dec 28, 2007)

NYDutch said:


> And the common desktop and server Linux kernel distros are GNU based. What's your point? Being able to modify the open source Linux kernel for your own needs is one of the great features of Linux. Even the Microsoft Linux based OS is easily modified if needed. Try that with a Windows NT kernel based OS.


What's yours? You told us like 3 yrs ago Windows was moving to a Linux kernel. What ever happened to that? We're talking home users here. Nobody at home is modifying Linux kernels except you & Harsh.

"Microsoft Linux based OS"? You mean WSL or Azure Linux? Lol. I can assure you nobody that uses Linux uses anything from Microsoft. Linux people hate anything Microsoft. Harsh is a perfect example. Every Mac/Linux Prod company I've ever worked at has had nothing Microsoft except Outlook. Java companies all use AWS.


----------



## NYDutch (Dec 28, 2013)

SledgeHammer said:


> What's yours? You told us like 3 yrs ago Windows was moving to a Linux kernel. What ever happened to that? We're talking home users here. Nobody at home is modifying Linux kernels except you & Harsh.
> 
> "Microsoft Linux based OS"? You mean WSL or Azure Linux? Lol. I can assure you nobody that uses Linux uses anything from Microsoft. Linux people hate anything Microsoft. Harsh is a perfect example. Every Mac/Linux Prod company I've ever worked at has had nothing Microsoft except Outlook. Java companies all use AWS.


I have no reason to think MS is not still working on a Linux kernel for Windows. The MS CBL-Mariner 2.0 Linux distro is currently available on GitHub. It was updated in October...



Release CBL-Mariner 2.0 October 2022 Release · microsoft/CBL-Mariner (github.com)

Microsoft Releases Its Linux Distribution Update For October 2022 - Phoronix (www.phoronix.com)


----------



## SledgeHammer (Dec 28, 2007)

NYDutch said:


> I have no reason to think MS is not still working on a Linux kernel for Windows. The MS CBL-Mariner 2.0 Linux distro is currently available on GitHub. It was updated in October...
> 
> 
> 
> ...


Yeah, it's the base Azure Linux container as I mentioned above. Nobody is running that at home or for any other purpose outside of Azure lol. And doubtful anybody would even run that in Azure when they can just run something proven like Ubuntu. But like I said, no Java shop is using Azure anyway. If you're a fly-by-night, random hole-in-the-wall company using C# for production apps, you'd use Azure (maybe), but you wouldn't run them on Linux containers because you're a Windows shop. And .NET classic doesn't even run on Linux because it's tightly coupled to the Windows kernel.

You can, of course, run .Net Core in a Linux container, but if you're doing that, you're just doing it to try to "be cool" (and failing) without really understanding the point of Linux containers. Go interview at a bunch of well known tech companies to get familiar with what tech stacks people actually use.

Maybe your fantasy Linux based Windows will come out in another 3 yrs? I'll check back.


----------



## NYDutch (Dec 28, 2013)

SledgeHammer said:


> Yeah, it's the base Azure Linux container as I mentioned above. Nobody is running that at home or for any other purpose outside of Azure lol. And doubtful anybody would even run that in Azure when they can just run something proven like Ubuntu. But like I said, no Java shop is using Azure anyway. If you're a fly-by-night, random hole-in-the-wall company using C# for production apps, you'd use Azure (maybe), but you wouldn't run them on Linux containers because you're a Windows shop. And .NET classic doesn't even run on Linux because it's tightly coupled to the Windows kernel.
> 
> You can, of course, run .Net Core in a Linux container, but if you're doing that, you're just doing it to try to "be cool" (and failing) without really understanding the point of Linux containers. Go interview at a bunch of well known tech companies to get familiar with what tech stacks people actually use.
> 
> Maybe your fantasy Linux based Windows will come out in another 3 yrs? I'll check back.


A Linux based Windows is certainly not my "fantasy". As far as I'm concerned, MS can keep its fingers out of the Linux pie completely. I've just commented on Linux/MS links that have been reported in the tech press. Fine with me if it never happens, but MS is obviously looking at Linux just as they looked at Chromium and eventually adopted it for Edge. Neither one of us has any idea when or if MS might move ahead with a Linux based product.


----------



## SledgeHammer (Dec 28, 2007)

NYDutch said:


> A Linux based Windows is certainly not my "fantasy". As far as I'm concerned, MS can keep its fingers out of the Linux pie completely. I've just commented on Linux/MS links that have been reported in the tech press. Fine with me if it never happens, but MS is obviously looking at Linux just as they looked at Chromium and eventually adopted it for Edge. Neither one of us has any idea when or if MS might move ahead with a Linux based product.


What does Chromium have to do with Linux? Internet Explorer was damaged goods, and they had 3 choices: get out of the browser market, use an existing engine, or write their own. Writing their own wouldn't make financial sense. Quite frankly, I don't know why they didn't exit the browser market, as Edge is pretty much about as popular as Linux is for home use. It's got 5% market share. Firefox has 8% and Safari has 9%. Chrome has 77%.

Microsoft obviously isn't moving Windows to Linux lol. First, it would cost too much and take too long. Second, what's the point? Third, you'd break all sorts of backwards compatibility. Just look at Wine, the open source attempt to port the Windows API to Linux: so many apps don't work on it, and they've been at it for 30 years. And fourth, for the millionth time, nobody uses Linux at home (2% market share) and nobody is going to want to learn all the cryptic command lines. If Microsoft wasn't willing to re-write a browser engine, what makes you think they'd want to re-write an OS? Windows has like 80% market share as it is and NONE of that has gone to Linux.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> You seriously have no clue what you're talking about. Windows & Mac & iOS use UTF-16. Also, Linux has supported UTF-16 for ages.


Don't confuse "support" with "use" or "default". The default text encoding on the Mac OS is UTF-8 (the classic default was MacRoman). It makes things nice since plain ASCII text comes through UTF-8 encoding byte-for-byte unchanged if you aren't doing anything out of the ordinary (for English speakers, anyway). That most certainly can't be said of UTF-16.
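
The ASCII-transparency point takes only a couple of lines of Python to see:

```python
# ASCII text is byte-for-byte unchanged under UTF-8...
print("plain text".encode("utf-8") == b"plain text")  # → True

# ...but under UTF-16 every code unit is at least two bytes, so even
# pure-ASCII content comes out padded with NUL bytes (and usually a BOM).
print("plain text".encode("utf-16-le"))
```
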


> Lol. No, not everyone needs or has a NAS.


I suppose there are some that can go through life and produce next to nothing worth saving or sharing.


> Lol, so how do you differentiate file types?


Because few of the encodings have identifying headers, it isn't easy. With my code, I just try various approaches and see what happens. Looking for telltale escape sequences is only partially useful.

Date formatting is also difficult. Windows typically uses a deeply buried OS parameter to determine the date format and "good" programs are expected to use it; unfortunately, that format is determined by locale rather than being consistent across the platform. Don't get me started on the epoch date confusion: NTFS counts from January 1, 1601 (the most recent start of the Gregorian calendar cycle), .NET counts from January 1, 0001, DOS used 1/1/1980, and Excel uses 1/1/1900 (an homage to Lotus 1-2-3). Regular expressions are huge here for both identification and reformatting (to something POSIX-style).
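
As a concrete illustration of the epoch zoo, here is a Python sketch converting between the NTFS (FILETIME) epoch and the Unix epoch; the helper function name is mine:

```python
from datetime import datetime, timezone

# Each system counts time from a different zero point:
NTFS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)  # Windows FILETIME epoch
UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Seconds between the two epochs -- the constant needed to convert a
# Windows FILETIME (100-nanosecond ticks since 1601) to Unix time.
offset = (UNIX_EPOCH - NTFS_EPOCH).total_seconds()
print(int(offset))  # → 11644473600

def filetime_to_unix(filetime: int) -> float:
    """Convert Windows FILETIME ticks to seconds since the Unix epoch."""
    return filetime / 10_000_000 - offset

# The FILETIME value for 1970-01-01 converts to exactly 0.
print(filetime_to_unix(116444736000000000))  # → 0.0
```
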


> Don't worry, I won't since they serve completely different purposes.


They're both actively used Windows file systems with FAT32 arguably being the preferred format for exchange with other platforms using USB drives. Exchange of information is key in the quest for collaboration and understanding.


> Here's another example where you seriously have no clue what you're talking about.


You must have this phrase stuck on your clipboard. It appears a lot in your posts where you're about to spew forth a bilious and/or inaccurate retort.


> First off, I specifically mentioned the IBM PC, not some dinosaur mainframe you grew up on with punch cards.


My second work computer was a Televideo 80286 "server" (complete with IBM PC AT expansion slots) running Xenix (Microsoft's version of *nix in a valiant attempt to be taken seriously in the computing world). It supported terminals (what with Televideo being a popular terminal manufacturer) but was by all accounts a microcomputer. Storage was provided by an ESDI-connected 5.25" full-height hard drive and a DC300 tape drive for backup. It didn't support Hollerith cards or punch tape.


> Turbo C was a Borland product btw.


Duh. My point was that Borland C/C++ wasn't a thing until after Windows came on the scene, not with the introduction of the IBM PC as you asserted.


> This explains why you are so out of touch with reality lol.


Those who fail to learn from history are doomed to repeat it -- Winston Churchill.

Reality changes with time but it is how we approach those changes that determines whether we step forward or just change for the sake of some other company's bottom line. Imagine where we might be if the benchmarks for computing goodness weren't the ability to run Flight Simulator and Lotus 1-2-3 back in the mid 1980s.

I well remember the wide-eyed Turbo Pascal fanboys trying to find work back in the day. They were going to take over the world with their IDEs and speedy executables only to learn that the real world was using something much different (BASIC, FORTRAN, COBOL).


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> What does Chromium have to do with Linux?


The better question is "what does using Chromium say about Microsoft's future?".

What doors does it open and does anyone stand to lose?

Market share is for lemmings.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> Don't confuse "support" with "use" or "default". The default text encoding on the Mac OS is UTF-8 (the classic default was MacRoman). It makes things nice since plain ASCII text comes through UTF-8 encoding byte-for-byte unchanged if you aren't doing anything out of the ordinary (for English speakers, anyway). That most certainly can't be said of UTF-16.


Don't confuse "not the default" with "not set up properly".



harsh said:


> I suppose there are some that can go through life and produce next to nothing worth saving or sharing.


You need a NAS to share or save things? News to the 95% of the population that doesn't have a NAS. How do you share your things with the world via a NAS? Ever heard of network shares? Or GitHub? Anybody who wants to save and/or share source code is using GitHub, not your NAS. Since you hate Windows and Microsoft so much, it's weird that you apparently haven't heard of the Google Docs suite that people use for sharing docs of various kinds. You might have trouble identifying the specific kind, though, since you apparently hate file extensions too.



harsh said:


> Because few of the encodings have identifying headers, it isn't easy. With my code, I just try various approaches and see what happens. Looking for telltale escape sequences is only partially useful.


UTF-8 is the default, and UTF-16 is automatically identified by its header (the byte-order mark). Other file types are generally identified by standardized extensions and headers.
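
Sniffing a byte-order mark really is simple when one is present; the catch is that plain UTF-8 usually has no BOM, so its absence identifies nothing. A minimal Python sketch (the helper name is hypothetical):

```python
import codecs

def sniff_bom(data: bytes):
    """Guess an encoding from a leading byte-order mark, if there is one."""
    if data.startswith(codecs.BOM_UTF8):      # b'\xef\xbb\xbf'
        return "utf-8-sig"
    if data.startswith(codecs.BOM_UTF16_LE):  # b'\xff\xfe'
        return "utf-16-le"
    if data.startswith(codecs.BOM_UTF16_BE):  # b'\xfe\xff'
        return "utf-16-be"
    return None  # no BOM: could be UTF-8, ASCII, or any legacy code page

print(sniff_bom("hi".encode("utf-16")))  # utf-16-le or utf-16-be, per platform
print(sniff_bom(b"plain ascii"))         # → None: the ambiguous case
```
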

"try various approaches and see what happens" isn't a very useful approach to life or software development.



harsh said:


> Date formatting is also difficult. Windows typically uses a deeply buried OS parameter to determine the date format and "good" programs are expected to use it; unfortunately, that format is determined by locale rather than being consistent across the platform. Don't get me started on the epoch date confusion: NTFS counts from January 1, 1601 (the most recent start of the Gregorian calendar cycle), .NET counts from January 1, 0001, DOS used 1/1/1980, and Excel uses 1/1/1900 (an homage to Lotus 1-2-3).


Wrong again. Date/time formatting and parsing is trivial. Try using a modern library or framework instead of rolling your own with regex and you wouldn't be working on it for a week or two as you claim; you'd work on it for a few minutes, since it's a one-liner in most modern frameworks. Most modern applications can also automatically identify the format. Excel surely can. So can C#, Java, Python, and any other modern programming language.
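
In Python, for instance, parsing and reformatting a date is a line or two with the standard library (the sample strings here are made up):

```python
from datetime import datetime

# Parsing a known format is a one-liner...
dt = datetime.strptime("12/28/2007 3:45 PM", "%m/%d/%Y %I:%M %p")
print(dt.isoformat())           # → 2007-12-28T15:45:00

# ...and so is reformatting it for another convention.
print(dt.strftime("%d %B %Y"))  # e.g. 28 December 2007 (month name is locale-dependent)

# ISO 8601 round-trips with no format string at all.
print(datetime.fromisoformat("2007-12-28T15:45:00") == dt)  # → True
```
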



harsh said:


> Regular expressions are huge here for both identification and reformatting (to something POSIX-style).


This is why you have so many problems and difficulties with your code, and in life it seems. Regex is most definitely NOT a solution for everything. You should use the appropriate libraries that take care of all these things for you, not roll your own. The exception is rolling your own for learning purposes, or for doing something that hasn't been done a million times already. Date/time parsing and formatting has been done a million times, and certainly not with regex, since regex couldn't handle all the edge cases.



harsh said:


> They're both actively used Windows file systems with FAT32 arguably being the preferred format for exchange with other platforms using USB drives. Exchange of information is key in the quest for collaboration and understanding.


File systems are transparent in most cases.



harsh said:


> It appears a lot in your posts where you're about to spew forth a bilious and/or inaccurate retort.


Actually, it is YOU that is regarded as the biggest troll on here. Even the mods have referred to you as a disturbing force.



harsh said:


> My point was that Borland C/C++ wasn't a thing until after Windows came on the scene, not with the introduction of the IBM PC as you asserted.


Where did I say Borland C/C++ came out with the IBM PC? Borland C++ and Turbo C++ were two different products. Although both were Borland.



harsh said:


> Those who fail to learn from history are doomed to repeat it -- Winston Churchill.


So why do you continue to post about things you clearly have no clue about? Myself, MANY other posters and most of the mods have also called you out on that. Again, you've been referred to by the mods as a disturbance to the forums.



harsh said:


> I well remember the wide-eyed Turbo Pascal fanboys trying to find work back in the day.


Pascal was never a thing in industry. Same as Visual Basic was never a thing in industry. Those were languages that were popular with hobbyists / students only.



harsh said:


> to learn that the real world was using something much different


Strange that you don't follow that mantra. You've made so many ridiculous claims in this thread and don't use anything popular or common it seems. Sorry, Linux isn't popular or common for home use.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> The better question is "what does using Chromium say about Microsoft's future?".
> 
> What doors does it open and does anyone stand to lose?


Who said it did? Edge has about the same penetration as Linux.



harsh said:


> Market share is for lemmings.


Weird comment since you're always asking for numbers and stats and that's what most people and companies on planet earth use.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> From day one of the IBM PC in 1983, the main programming language was C. First Borland C/C++, then when Microsoft took over it became Visual C/C++





SledgeHammer said:


> Where did I say Borland C/C++ came out with the IBM PC? Borland C++ and Turbo C++ were two different products. Although both were Borland.


Any questions?


SledgeHammer said:


> Pascal was never a thing in industry. Same as Visual Basic was never a thing in industry. Those were languages that were popular with hobbyists / students only.


Pascal was Apple's official development platform for the Mac (and the only way to access early Apple hard drives) for some time as I stated.

You've just come into the game much too late to have seen the progression and are passing judgement on others based on your limited exposure.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> Any questions?


Yes. Are your reading skills as bad as your coding skills? I said "From day one of the IBM PC in 1983, the main programming language was C" the EXACT line which you quoted. Then there's a period. Then I said Borland was the most popular C/C++ compiler and then Microsoft took over. A period generally is used to identify the end of a sentence. You must have missed that with your regex. If you are talking about the compiler that came with the XT, I believe that was IBM C, but Borland was the one that really took off.



harsh said:


> You've just come into the game much too late to have seen the progression and are passing judgement on others based on your limited exposure.


Wrong again. I've been a software engineer for 30 years and have been using computers even longer than that, back to the '80s. I've just evolved with the times. You clearly have not; you're stuck on what you did 20 years ago, which is largely irrelevant. As a matter of fact, stuff from even 3-5 years ago is largely irrelevant today.

You also seem to live in an alternate reality where you don't use a cell phone, but everybody has a NAS and is recompiling Linux kernels at home and transferring files across 17 different systems.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Then I said Borland was the most popular C/C++ compiler and then Microsoft took over.


You didn't say "then there was Borland C". You literally said "First Borland C/C++." "First" literally means that there was no other before it. There were many development tools used before Borland C (or even Turbo C) came onto the scene. Many, if not most, were not C.

My experience with computers and programming goes back to the late 1970s. I've seen a lot and I've learned a lot by keeping an open mind rather than having Microsoft (or Apple) telling me how things are.


SledgeHammer said:


> As a matter of fact, stuff from like 3-5 yrs ago is largely irrelevant today.


Not irrelevant, just not what Microsoft recommended. Five years ago, Java was the hot ticket (but Microsoft was pushing C# and .NET hard). Today the list of preferred programming languages is topped by two scripting languages (Javascript circa 1995 and Python circa 2000). The fact that Microsoft hired Guido van Rossum should tell you something. Clearly neither is really suitable for writing Windows applications but that may be telling in where the industry is headed: multi-platform versus monolithic Windows desktop applications.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> You didn't say "then there was Borland C". You literally said "First Borland C/C++." "First" literally means that there was no other before it. There were many development tools used before Borland C (or even Turbo C) came onto the scene. Many, if not most, were not C.


First as in mainstream.



harsh said:


> My experience with computers and programming goes back to the late 1970s. I've seen a lot and I've learned a lot by keeping an open mind rather than having Microsoft (or Apple) telling me how things are.


You're the most close minded person I've ever met lol. You even contradicted yourself in that comment.



harsh said:


> Not irrelevant, just not what Microsoft recommended.


People stopped caring about what Microsoft recommends at least a decade or more ago. Outside of Windows & Office, they are largely irrelevant to the average person. They are also largely irrelevant in software development since most companies aren't using Microsoft tech since it doesn't scale well.



harsh said:


> Five years ago, Java was the hot ticket (but Microsoft was pushing C# and .NET hard).


Java has been the hot ticket a lot longer than that and is still the hot ticket for backend work. If you want to work at a tech company, you aren't using C# and .NET since, well, ever.



harsh said:


> Today the list of preferred programming languages is topped by two scripting languages (Javascript circa 1995 and Python circa 2000). The fact that Microsoft hired Guido van Rossum should tell you something. Clearly neither is really suitable for writing Windows applications but that may be telling in where the industry is headed: multi-platform versus monolithic Windows desktop applications.


Thick client development has gone the way of the dodo bird.

Ever heard the expression "right tool for the right job"?

- The most popular language for backend work is Java/Spring, with newer code bases on Spring Boot; some places are experimenting with Kotlin, which was hot for a few seconds but doesn't seem to have gained much traction.
- The most popular language for ML & data science is Python.
- The most popular language for front-end work is JavaScript.
- I'm not into mobile development, but I believe for iOS it's either Objective-C or Swift.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> First as in mainstream.


Not there either. Aztec C and Lattice C were right there near launch along with other name brands like Intel and Watcom. Turbo C came along to save Borland's bacon as Turbo Pascal was losing steam along with CP/M. The fact is that C wasn't all that popular for personal computers (DOS or CP/M) in 1983.


> Java has been the hot ticket a lot longer then that and is still the hot ticket for backend work.


For some, but faster and more reliable compiled languages (i.e. Rust, Go) are making big headway. I find Rust to be ever so slightly less obtuse as compared with Go.


> Ever heard the expression "right tool for the right job"?


Sure. I've also heard the expression "when all you have is a hammer, everything looks like a nail".


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> Not there either. Aztec C and Lattice C were right there near launch along with other name brands like Intel and


You clearly don't understand what the word mainstream means.



harsh said:


> Watcom. Turbo C came along to save Borland's bacon as Turbo Pascal was losing steam along with CP/M. The fact is that C wasn't all that popular for personal computers (DOS or CP/M) in 1983.


The internet would disagree with you, as C surpassed Pascal by 1985 and stayed on top until the early 2000s. By comparison, Pascal was only on top for a measly 5 years. Also, if you were on Apple, the most popular language at the time was Objective-C.



harsh said:


> For some, but faster and more reliable compiled languages (i.e. Rust, Go) are making big headway. I find Rust to be ever so slightly less obtuse as compared with Go.


No, they're not. They're only making a small dent with unicorns. They're currently ranked #9 & #10, while Python is #1 & Java #2 and Javascript #3.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> You clearly don't understand what the word mainstream means.


You clearly don't understand that C wasn't the preferred development tool for the IBM PC in 1983. If C wasn't the preferred tool, it is probably not possible for any C compiler to be considered mainstream.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> You clearly don't understand that C wasn't the preferred development tool for the IBM PC in 1983. If C wasn't the preferred tool, it is probably not possible for any C compiler to be considered mainstream.


You clearly don't understand what the word preferred means either.

Seeing as the XT shipped with a C compiler, you're wrong, as usual. As was your claim that C didn't take off til "well into the 90s" when it was the top language by 1985.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Seeing as the XT shipped with a C compiler, you're wrong, as usual.


Seeing as the XT didn't make an appearance until two years after the PC, you're bending the timeline yet again.


> As was your claim that C didn't take off til "well into the 90s" when it was the top language by 1985.


My '90s reference was with respect to when Borland C/C++ became available (in response to one of your numerous whoppers).

Just because a computer ships with something doesn't make it desirable or mainstream. DOSSHELL (shipped with MS/PC-DOS 4.0 and one of the few interactive "applications" that shipped with DOS) is a classic example of this. DOSSHELL arguably didn't catch fire until it became Windows Explorer.

The IBM PC and IBM PC XT both came with "IBM BASIC" (a branded version of the then ubiquitous Microsoft BASIC interpreter) in ROM (just like the Atari 8-bits, the Apple ][, the Commodore PET, Vic-20, 64 and others). I see no evidence that either PC-DOS or MS-DOS _ever_ shipped with a C compiler of any kind. There were no compiler options listed for the IBM PC by IBM.

The first IBM PC that I laid hands on was purchased with a FORTRAN compiler as that was what was hot at the time. Others were ordered with COBOL systems.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> My '90s reference was with respect to when Borland C/C++ became available


Wrong again. Borland released a C/C++ compiler in May 1990. Hardly "well into the 90s". IBM released theirs in 1985 which was a rebadged Microsoft compiler which was released in 1983. Say, right around the time C became the top language in the world. All your timelines are off by about a decade.



harsh said:


> Just because a computer ships with something doesn't make it desirable or mainstream. DOSSHELL (shipped with MS/PC-DOS 4.0 and one of the few interactive "applications" that shipped with DOS) is a classic example of this. DOSSHELL arguably didn't catch fire until it became Windows Explorer.


Uh, PC-DOS/MS-DOS were extremely popular. No, not everything that ships with a machine is desirable or popular. That's why we have those two words, to describe the things that are.



harsh said:


> The IBM PC and IBM PC XT both came with "IBM BASIC" (a branded version of the then ubiquitous Microsoft BASIC interpreter) in ROM (just like the Atari 8-bits, the Apple ][, the Commodore PET, Vic-20, 64 and others). I see no evidence that either PC-DOS or MS-DOS _ever_ shipped with a C compiler of any kind. There were no compiler options listed for the IBM PC by IBM.


Wrong yet again. It came with IBM C.



harsh said:


> The first IBM PC that I laid hands on was purchased with a FORTRAN compiler as that was what was hot at the time. Others were ordered with COBOL systems.


You are again confusing what Harsh thinks is relevant or popular with what the rest of the planet does. I've yet to see ANYTHING you've mentioned or use or do or think to be relevant, popular or mainstream. You're the exact opposite of all those words.

You don't use a cell phone, you compile linux kernels, you use linux, you don't use Windows or Office. You use regex to parse dates and files (rather poorly it seems), you think everybody has or needs a NAS, you don't use github, you don't use file extensions, all the languages you've mentioned were largely niche things or only used in business, etc. I could go on and on. Heck, even your choice of dog breed isn't even in the top 25.

Your comment about how Java "was" the hot ticket is probably your most absurd comment yet.

Your hobby of trolling DirecTV forums and becoming an "expert" on a service you've never had is also an indication of your lack of understanding of what the average person does. That activity is most definitely not mainstream.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Wrong again. Borland released a C/C++ compiler in May 1990. Hardly "well into the 90s". IBM released theirs in 1985 which was a rebadged Microsoft compiler which was released in 1983. Say, right around the time C became the top language in the world. All your timelines are off by about a decade.


How does a C compiler that was released in 1985 (or even 1983) ship with a computer that was released in 1981?

What would have been the point of including a C compiler with PC/MS-DOS? My recollection was that C was mostly an academic language on platforms other than Unix in 1981.

You produce a lot of information that isn't backed up with documentation. How much of it is true versus how much is made up or misremembered?


> Uh, PC-DOS/MS-DOS were extremely popular.


They became popular ultimately but not on your timeline. Of course you're surreptitiously morphing the context of the discussion by claiming popularity of the operating system versus the popularity of the tools used to write applications for that platform.


> Wrong yet again. It came with IBM C.


And your proof of this is?

From IBM's own Product Fact Sheet:


IBM said:


> *System Software*
> BASIC Interpreter -- Based on the popular Microsoft Basic and offered in three versions -- cassette, diskette and advanced.
> The cassette level is included in the read-only memory of every system and provides input/output instructions needed to enter and retrieve data. It also supports use of the keyboard, display, light pen and printer and provides a full complement of editing and mathematical functions.
> The diskette and advanced levels are optional. The diskette extension supports the use of diskettes, while adding date, time of day and communications capabilities to the system. The advanced extension enhances the display graphics to include features such as point, circle and get/put display, while increasing light pen and joy stick support for design work and home entertainment.
> ...


Careful readers will note the conspicuous absence of a C compiler.

I can't imagine what it would be like to use a C compiler with a single 160K floppy drive and not much RAM. I'm going to assume that a compiler wasn't an option on the cassette-based systems with 16K of RAM.


> You are again confusing what Harsh thinks is relevant or popular with what the rest of the planet does. I'm yet to see ANYTHING you've mentioned or use or do or think to be relevant, popular or mainstream. You're the exact opposite of all those words.


We all have our experiences. I've been designing and writing software since 1977, not just coding.


> You don't use a cell phone, you compile linux kernels, you use linux, you don't use Windows, or Office.


You're making things up and in the style that you seem to favor, you're wrong yet again. I own a wireless smart phone (but only use it occasionally), I don't compile Linux kernels, and I do use Windows for some things (like my income taxes).


> You use regex to parse dates and files (rather poorly it seems),


What would you recommend if the need is to parse dates not covered by POSIX, with the full understanding that the universal Windows date (FILETIME) is based on 100-nanosecond intervals elapsed since January 1, 1601? One of the formats I'm dealing with manifests as this: Mon 16 January 09:54 2012 UTC. You have to recognize it before you can parse it (and you can't be serious about piping everything through Excel to parse the dates for you).

Windows itself produces dates in dozens of different formats (based on both the chosen locale and settings within that locale).
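Once a format like the one above has been recognized, the actual parsing is straightforward with a standard library. Here's a minimal Python sketch of both halves of the problem; the format string and the `filetime_to_datetime` helper name are illustrative assumptions, not anyone's production code:

```python
from datetime import datetime, timedelta, timezone

# Parse the quoted format with the standard library (no regex needed once
# the format has been recognized). The format string assumes English month
# and day names, i.e. a C/English locale.
stamp = datetime.strptime("Mon 16 January 09:54 2012 UTC",
                          "%a %d %B %H:%M %Y %Z")

# Windows FILETIME counts 100-nanosecond intervals since 1601-01-01 UTC.
# 'filetime_to_datetime' is a hypothetical helper name for illustration.
def filetime_to_datetime(ft: int) -> datetime:
    epoch = datetime(1601, 1, 1, tzinfo=timezone.utc)
    return epoch + timedelta(microseconds=ft // 10)

# 116444736000000000 ticks after 1601 lands on the Unix epoch, 1970-01-01 UTC.
unix_epoch = filetime_to_datetime(116_444_736_000_000_000)
print(stamp, unix_epoch)
```

Recognizing *which* of many candidate formats a given string uses is still the hard part; the sketch only covers the single format quoted above.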


> you think everybody has or needs a NAS,


Need may be too strong. Everyone who has more than a couple of computers or devices connected by a LAN could benefit from a NAS.


> you don't use github,


I do use git for some things but I have my own git server. On the scale that I'm currently developing and at this stage, collaboration and versioning aren't major concerns. I happily use code and information hosted by github.


> ... and all the languages you've mentioned were largely niche things or only used in business, etc.


So you're insisting that business computing is irrelevant and it has no place on IBM compatibles. What do the projects that you participate in create code for?


> I could go on and on.


The issue is neither your endurance nor your willingness to look the fool. The question is whether you will ever impress anyone with your inaccurate "facts" and generously sprinkled personal attacks.


> Your comment about how Java "was" the hot ticket is probably your most absurd comment yet.


Show me a site that tracks such things that doesn't show Java on the wane and we'll talk. Stack Exchange puts Java at #7 and PYPL puts it at #2, but they don't offer much in the way of context.

Berkeley lists a top eleven most "in demand" programming languages and it places Java at #5 behind (in order) Javascript, Python, HTML and CSS. They include C# but omit C and C++ in favor of Rust, Perl and Go. Of course those who are writing applications or gaming software probably aren't going to be using scripting languages but "in demand" surely comes down to what pays the bills.

BTW, the AKC lists the top 197 breeds and the Basset Hound comes in at #34. Do you (or would you) own a Lab simply because that breed is #1 on the definitive list of top breeds?


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> How does a C compiler that was released in 1985 (or even 1983) ship with a computer that was released in 1981?
> 
> What would have been the point of including C compiler with PC/MS-DOS? My recollection was that C was mostly an academic language on platforms other than Unix in 1981.


Careful readers (not you apparently) would note that nobody ever mentioned 1981 except you. I mentioned 1983. But nice (poor) attempt on manipulating the facts.



harsh said:


> You produce a lot of information that isn't backed up with documentation. How much of it is true versus how much is made up or misremembered?


I think you're referring to yourself here. All my comments are well documented on the internet. Ever heard of it?



harsh said:


> They became popular ultimately but not on your timeline. Of course you're surreptitiously morphing the context of the discussion by claiming popularity of the operating system versus the popularity of the tools used to write applications for that platform.


I've already proven multiple times that your timeline is off by a decade. Whether you accept reality isn't my concern. I know it interferes with your trolling though.



harsh said:


> I can't imagine what it would be like to use a C compiler with a single 160K floppy drive not much RAM. I'm going to assume that a compiler wasn't an option with the cassette tape storage based systems with 16K of RAM.


Wrong again, they had dual 360KB floppies. Cassette tape? Now you're just making stuff up to troll. PCs had floppies and later hard drives. You're confusing history with your VIC-20.

16K of RAM? You need to stop smoking weed or whatever other drug you're using. The XT had 640KB which was plenty to run a C compiler.



harsh said:


> I've been designing and writing software since 1977, not just coding.


Total BS. You're not a software engineer and never have been. You're a desktop support guy / hobbyist / hack. Software engineers don't use regex to parse dates & files. Never have, never will. There are established libraries and formats for all of this.



harsh said:


> I own a wireless smart phone (but only use it occasionally)


You have a "_wireless_ smart phone"??? LOL... as opposed to a non wireless one? The rest of us on planet earth use them exclusively. You're confusing things with your landline.



harsh said:


> What would you recommend if the need is to parse dates not covered by POSIX, with the full understanding that the universal Windows date (FILETIME) is based on 100-nanosecond intervals elapsed since January 1, 1601? One of the formats I'm dealing with manifests as this: Mon 16 January 09:54 2012 UTC. You have to recognize it before you can parse it (and you can't be serious about piping everything through Excel to parse the dates for you).


More evidence you're not a software engineer or coder, or even a hack. That's a standard format recognized by any modern library.



harsh said:


> Windows itself produces dates in dozens of different formats (based on both the chosen locale and settings within that locale).


Wrong again. You're confusing storage with display. Again, you're not a software engineer.



harsh said:


> Need may be too strong. Everyone who has more than a couple of computers or devices connected by a LAN could benefit from a NAS.


Really? Tell us what you store on yours.



harsh said:


> I do use git for some things but I have my own git server. On the scale that I'm currently developing and at this stage, collaboration and versioning aren't major concerns.


Yet more evidence you're not a software engineer. Git and Github are 2 completely different and independent things. Weird since you claim you have so much to share with the world that you aren't sharing it in the standard way, but then again, you don't know the diff between git & github, so that explains things.



harsh said:


> I happily use code and information hosted by github. So you're insisting that business computing is irrelevant and it has no place on IBM compatibles. What do the projects that you participate in create code for?


Are you talking to me? I'm a lead software engineer at a large well known company. I write code all day and get paid extremely well for it. You live in a remote mountain cabin trolling on forums all day.



harsh said:


> The issue is neither your endurance nor your willingness to look the fool. The question is will you ever impress anyone with your inaccurate "facts" and generously sprinkled personal attacks


You're again confusing me with your poor trolling efforts.



harsh said:


> BTW, the AKC lists the top 197 breeds and the Basset Hound comes in at #34. Do you (or would you) own a Lab simply because that breed is #1 on the definitive list of top breeds?


I said a BH isn't in the top 25 and you just admitted that was true. Finally we're getting somewhere.

Your brain functions in a rather peculiar way. Yes, way more people buy labs vs. bh's. That's why labs are #1 and bhs are #34. Do you know basic math? 1 > 34 when it comes to ranking.

If you think Java is on the way out, you are even more clueless than most people on the forums think you are. And you obviously aren't a software engineer, since every back-end job is Java / Spring.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Careful readers (not you apparently) would note that nobody ever mentioned 1981 except you. I mentioned 1983. But nice (poor) attempt on manipulating the facts.


1983 was an erroneous claim on your part. The IBM PC was introduced in 1981, not 1983 as your post #49 unequivocally proclaims.


> All my comments are well documented on the internet.


Documented perhaps but rarely, if ever, cited in the appropriate context. There is a whole lot of misinformation floating around on the Internet. You've demonstrated your ability (and apparently unbounded willingness) to contribute to the misinformation.


> Wrong again, they had dual 360kb floppies.


They could, but they didn't have to (implied by the word "had"). The floppy capacity, as clearly stated under the heading of "Diskette Drive", was 160K and the drive count was "up to" (to use a popular DIRECTV squishy term) two drives. The double-sided drives (as well as support for nine sectors instead of the eight) came later.


> Cassette tape? Now you're just making stuff up to just troll. PCs had floppys and later hard drive.


Again, read the official documentation.


> You're confusing history with your VIC-20.


I've never had a Vic-20. How can anyone make so many incorrect guesses in a single thread?


> 16K of RAM?...The XT had 640KB which was plenty to run a C compiler.


The PC XT (model 5160) wasn't the "day one" PC (model 5150). The original 5150 base model had 16K and a tape drive jack. Whether or not many (or even just one person) ordered that configuration is immaterial as you're summarily denying their existence. Here's a run-down of the standard features (from the aforementioned 5150 document -- highlights mine):


IBM said:


> -Keyboard for data and text entry
> -Cassette player jack for cassette attachment
> -Five expansion slots for additional memory and display, printer, communications and game adaptors
> -Built-in speaker for musical programming
> ...


With respect to the IBM PC XT, the base RAM complement was 256K, not 640K. Only the two most expensive configurations came with 640K of RAM. RAM chips sold for around $4.50 per kilobyte back in 1981, so "bulking up" wasn't a no-brainer unless your 1-2-3 spreadsheet was bursting at the seams.


https://jcmit.net/memoryprice.htm
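To put that price in perspective, a quick back-of-the-envelope calculation (assuming the roughly $4.50-per-kilobyte figure cited above) runs like this:

```python
# Rough 1981-era RAM cost at ~$4.50 per kilobyte (an assumed ballpark
# figure from the memory-price discussion above); illustrative arithmetic only.
price_per_kb = 4.50
base_256k = 256 * price_per_kb   # cost of the XT's base 256K complement
maxed_640k = 640 * price_per_kb  # cost of a full 640K complement
print(f"256K: ${base_256k:,.0f}  640K: ${maxed_640k:,.0f}")
```

On those assumed prices, going from the base 256K to a full 640K added well over $1,700 in memory alone.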


For anyone interested in the facts, here's IBM's take with respect to the IBM PC XT:

AP - IBM Personal Computer XT (www.ibm.com)


> Git and Github are 2 completely different and independent things.


I use git exclusively to "clone" stuff from github. It meets my cloning needs quite well and is effortlessly scripted (or cut and pasted into a terminal window as is most commonly my case).


> I said a BH isn't in the top 25 and you just admitted that was true. Finally we're getting somewhere


Since this is one of the few things you've claimed that was actually true, you should consider it partial restitution for your many false claims. It doesn't change the fact that one shouldn't base their choice of dog breed (or operating system) entirely on what the surveys say. There must be some reason that the other 196 breeds on the list continue to be actively cultivated in addition to the breeds that aren't listed (or have been formally de-listed). From a practical pet standpoint, nobody should categorically dismiss mixed breed dogs either.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> 1983 was an erroneous claim on your part. The IBM PC was introduced in 1981, not 1983 as your post #49 unequivocally proclaims


It unequivocally proclaims that you lack reading comprehension and are desperately trying to troll. "From day 1 in 1983" doesn't imply the XT came out in 1983, it implies I am speaking of the year 1983. You lose AGAIN. Hilarious.



harsh said:


> .Documented perhaps but rarely, if ever, cited in the appropriate context. There is a whole lot of misinformation floating around on the Internet.


As I unequivocally proclaimed above, whether you accept fact or are a "fact denier" is none of my concern. And you have been REPEATEDLY admonished by the mods and 99% of the users for that.



harsh said:


> They could, but they didn't have to


That just unequivocally proclaims that you bought a stripped down base model and I didn't. Mine had dual 360kb, full height 5 1/4" floppies and came with IBM C. Or maybe you bought yours in 1981 while I bought mine in 1983 which is the year I've talked about?



harsh said:


> The PC XT (model 5160) wasn't the "day one" PC (model 5150). The original 5150 base model had 16K and a tape drive jack.


There you go again living in your own fantasy world and denying reality. I have unequivocally talked from the beginning of the thread of 1983.



harsh said:


> Only the two most expensive configurations came with 640K of RAM.


WOW. I got you to admit TWO things in a single thread. You're even bad at trolling. A higher quality troll would unequivocally never admit anything.



harsh said:


> Since this is one of the few things you've claimed that was actually true


Now I got you to admit THREE things in a single thread. Now we're getting somewhere.



harsh said:


> It doesn't change the fact that one shouldn't base their choice of dog breed (or operating system) entirely on what the surveys say. There must be some reason that the other 196 breeds on the list continue to be actively cultivated in addition to the breeds that aren't listed (or have been formally de-listed). From a practical pet standpoint, nobody should categorically dismiss mixed breed dogs either.


You're a survey and stat denier. That's great. Who cares? While popular things are not GUARANTEED to be the best and the best things are not GUARANTEED to be popular, they are generally highly correlated.

Your OS of choice has a 2% market share, and your dog of choice also has a 2% market share.

My OS of choice has a 77% market share, and my dog of choice has a 45% market share. Both are generally regarded as better than your choices, and the stats and market share back that up.

Sounds like my IBM XT was also better than yours, since you were an early adopter and bought into the half-baked version. Sux 2 b u.

But it's facetious of you to unequivocally proclaim that everyone must have bought the same crappily configured machine that you did.

Much like it's facetious of you to unequivocally proclaim that you "design" software when you clearly are a home hobbyist who doesn't know what he's doing. Nobody does it the way you do. Parsing dates is a one-liner in every modern language. Using regex for this unequivocally exposes you as an amateur.

Even if you wanted to roll your own date-time parser, you still wouldn't do it that way. But it's doubtful anybody in 2022 would bother rolling their own, since it's built in to 100% of languages.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> "From day 1 in 1983" doesn't imply the XT came out in 1983, it implies I am speaking of the year 1983.


If only that is what you had said. Since you can't be bothered to follow links or look back, I'll quote what you said (for a third time):


SledgeHammer said:


> From day one of the IBM PC in 1983, the main programming language was C.


"day one of the IBM PC" wasn't 1983. The IBM PC that had its "day one" in 1981. One clause your statement is clearly false and all we have is your word on the second. Maybe you can share some evidence of your claim that "the main programming language was C" for DOS in 1983 as you haven't yet bloodied your nose on that. I get the distinct feeling that a lot of FORTRAN, assembly and even Pascal/Turbo Pascal programmers aren't going to agree but if you can find authoritative documentation you might salvage on that.


> As I unequivocally proclaimed above, whether you accept fact or are a "fact denier" is none of my concern.


What should concern you is that you crafted a statement that isn't true (for one or possibly more reasons) and no amount of misrepresenting what you said, heaping on insults or trying to campaign for the support of others is going to change that. It doesn't present well to accuse me of being a "denier" when you're denying that you said what you said.

The fact is that the IBM PC XT did indeed debut in 1983 so I'm surprised you're trying to walk that back.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> If only that is what you had said. Since you can't be bothered to follow links or look back, I'll quote what you said (for a third time):


Actually, I said "From day one of the IBM PC in 1983, the main programming language was C.". I'm sorry you fell on your head and you can't comprehend the written word. From the very first post, I have been talking about 1983.

Day one of anything in 1983 is January 1st, 1983 if you want to be specific.

I can see why you're having such trouble parsing dates, you don't understand how they work at all!! Add another item to the ever growing list of stuff Harsh doesn't understand.


----------



## SledgeHammer (Dec 28, 2007)

harsh said:


> The fact is that the IBM PC XT did indeed debut in 1983 so I'm surprised you're trying to walk that back.


HAHAHAHA!!!!! worst troll attempt by Harsh EVER... you screwed up big time!!!!!!! I even screen capped so you can't go back and lie about editing it.

Yes, I agree, the IBM XT did INDEED debut in 1983. That's what I've been saying all along!!! Glad you finally stepped up and admitted you've been lying all this time.


----------



## harsh (Jun 15, 2003)

SledgeHammer said:


> Day one of anything in 1983 is January 1st, 1983 if you want to be specific.


The first argument of the sentence is "Day one of the IBM PC" rather than day one of 1983. Regardless of what year you contemplate its release, "Day one of the IBM PC" will always be August 12, 1981.

Considering your birthday in 1983 (or 2023) doesn't change the date you were born. It only determines the day that you celebrated your birthday that year. Unless you were born on New Year's Day, that date won't be January 1st.


----------



## harsh (Jun 15, 2003)

[double posted]


----------



## MysteryMan (May 17, 2010)

SledgeHammer said:


> Harsh doesn't understand.


I suggest the thread be renamed "Harsh Doesn't Understand".


----------



## SledgeHammer (Dec 28, 2007)

MysteryMan said:


> I suggest the thread be renamed "Harsh Doesn't Understand".


LOL... if we started a thread about that, we'd be on page 5,000 by now.

On the bright side, at least he finally admitted lying about it being 1983.


----------



## James Long (Apr 17, 2003)

"Never argue with a fool; onlookers may not be able to tell the difference."


----------

