6502.org Forum

All times are UTC




PostPosted: Mon Aug 15, 2016 8:57 pm 
Offline

Joined: Sat Dec 13, 2003 3:37 pm
Posts: 1004
As consumers, most users don't really care what internal architecture their computer uses. Especially in today's day and age, where computers are "fast enough", people will care less and less about internal architecture.

I have no idea what CPUs are running inside of my Jeep, as an example. Or my refrigerator. Or my stove, or TV, or thermostat. Boxes with lights and buttons that I hope simply continue to work.

The Windows eco-system is pretty much "hard coded" to the x86 architecture. For good and ill, they're pretty much stuck with it. For a short while, Windows NT dabbled with other processor families (notably the Alpha), but that didn't really last.

Apple is less so. Being based on NeXTStep, Mac OS has been multi-architecture to its core since the mid-90's. If Apple decided to, it could readily come out with a new CPU platform, such as a laptop based on the A9 (one of Apple's recent ARM-based CPUs). Does the existing software work on it? No, it does not. But for most software, it's a re-compile away (plus testing, naturally), and the platform readily handles "fat" binaries that support multiple architectures, so distribution is basically painless for vendors. Apple already promotes "architecture free" development practices throughout its tool stack. If you're following Apple's guidelines, the port process as a developer should be straightforward.

Linux and the BSDs are cross-architecture today already; much of their software has been ported to several CPUs. The only things really keeping the OSS systems from running on alternative architectures are readily available hardware, and then drivers on top of that, with GPU acceleration always being the bugaboo plaguing open systems.

Android is mostly architecture agnostic, since the primary development environment is VM based. Sure, some code out there is written to the underlying CPU, but the bulk of it is not. Android is not really on the desktop per se yet, but Chrome OS on Chromebooks will be running Android apps very soon.

Finally, with more and more development being done at a higher level, using VM architectures like Android's or environments such as JavaScript, plus the litany of other scripting languages, the dependency on the CPU is getting smaller and smaller. Most server-side developers already "don't care" what CPU they're using.

The large internet companies would switch architectures in a heartbeat if there were a discernible benefit to doing so. If Facebook felt that ARM would provide a 10% boost in energy efficiency over the long term compared to Intel, they'd get boards designed and built, and would start porting their back end over. (I'm not suggesting ARM can do that.)

The large AMD ARM servers seem to be pointed at the energy-conscious market. The raw truth is that for a large number of use cases, raw performance is not necessarily the first consideration in decision making, and in those scenarios, perhaps 1000 ARM cores in a server are more useful than a few monster Intel chips.

But this is why you want multiple architectures to be viable in the market, especially the consumer market. You need diversity to avoid a monoculture. You need diversity to encourage competition, to hold Intel's feet to the fire. Intel needs that for its own sake.

It would be great to see some more CPUs in the mobile space, something besides ARM, especially since Android can in theory support this already; let the foundries and designers work their magic to make a better mobile CPU. Apple brought it in house simply so it could scale production better and get exactly what it wanted for its use cases. But in the end, they're still ARM chips inside.


PostPosted: Mon Aug 15, 2016 9:00 pm 
Offline
User avatar

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10949
Location: England
Keep an eye on RISC-V - it's the next big thing!


PostPosted: Mon Aug 15, 2016 9:19 pm 
Offline

Joined: Tue Nov 10, 2015 5:46 am
Posts: 228
Location: Kent, UK
Bregalad wrote:
ARM also has a "thumb" mode where instructions are 16-bit, making code much smaller.
Yes, Thumb definitely helps, but you still can't escape the load/store RISC model. An increment of a memory variable, pointed to by a register, takes three RISC instructions (LOAD/ADD/STORE) vs. one CISC instruction (ADD with a memory operand). In Thumb that's 6 bytes vs. x86's 3 bytes. You'll get a clearer picture of the difference if you compile a large 'C' file with a cross compiler and different compiler options.
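To make the comparison concrete, here's a one-line C function that performs exactly that memory increment. The instruction sequences in the comments are typical codegen, not output from any particular compiler, so treat the byte counts as representative rather than exact:

```c
#include <stdint.h>

/* Incrementing a memory variable through a pointer: one line of C,
 * but a different instruction count per architecture.
 *
 * Typical Thumb (load/store RISC) codegen, 3 instructions / 6 bytes:
 *     ldr  r1, [r0]      ; load   (2 bytes)
 *     adds r1, #1        ; add    (2 bytes)
 *     str  r1, [r0]      ; store  (2 bytes)
 *
 * Typical x86 (CISC, memory operand) codegen, 1 instruction / 3 bytes:
 *     add dword ptr [eax], 1     ; 83 00 01
 */
void increment(uint32_t *p)
{
    (*p)++;
}
```

Compiling this with `-Os -S` under both an ARM cross compiler and an x86 compiler, as the post suggests, lets you compare the actual listings side by side.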

Quote:
On the Game Boy Advance platform, it's common for games to be written ~98% in Thumb and only tight, optimized routines to be written in ARM (since the ROM has a 16-bit port anyway, executing ARM code from it takes 2 cycles per instruction, which is ineffective, so only code loaded into RAM is potentially worth writing in ARM).
I know it well. Back around 2003 I programmed a few little games on the GBA (with help from gbadev.org). It's how I taught myself ARM assembly. I loved it... it was a fun little device.

Quote:
I love the old Windows, but I hate the new ones.
I'm of the polar opposite opinion - I think Windows is getting better with every generation. There's a lot of hate for Windows, and Microsoft in general: from the old guys who'll just never give Microsoft a break for killing Netscape, to Apple fans who conveniently ignore all of Apple's bad behavior and cling to the "Think Different" motto from the '90s... According to a NYT article from 2013, the median age of Microsoft employees is 34. Most employees were not involved in, nor remember, the things people hold grudges about. The grudge is old and stale.

I had a poor opinion of PCs in general throughout the 80s, where machines like the Atari ST and Commodore Amiga beat the pants off of any PC. But then things got better. And better. By the early 90s it was foolish to ignore what Microsoft and Intel were doing. I went through my "Microsoft is evil" phase. Now I've got work to do (and games to play).

Quote:
Most modern software I use are GNU programs that I keep up to date, and I can have the exact same ones on Linux. The only reason I still use Windows is that I have a large pool of extremely old software that I cannot, or do not want to, part with.
That's the key difference between you and me, and it's where your loathing for modern Windows makes complete sense. If you're running old software and need solid 16-bit support, then it sounds like Microsoft is letting you down. For a company famous for maintaining backwards compatibility with older releases, for practical customer reasons (to the detriment of the advancement of its own platform), it's a shame (if not inevitable) that they'd eventually let 16-bit go. I actually wasn't aware that 16-bit didn't work any more.

Quote:
I will have no reason to use Windows any more, especially not with ****ty graphical interfaces and MS spying issues.
I use Google and (to a lesser extent) Facebook. If they want to collect data to better target ads at me, then good on them. Why _wouldn't_ I want ads tailored to my preferences, rather than seeing ads for cat litter (I don't have a cat)...? As for Microsoft "spying"... meh... the telemetry and metadata is either used for advertising (like everyone else's) or engineering feedback. I'm one record in a database of billions. Nobody in Redmond gives a damn about my porn preferences (MILF porn, for the curious).

Quote:
Quote:
Other product groups historically used other architectures, such as PowerPC, [...] but have in recent years transitioned over to x86... which now hit the size, power and thermal targets whilst delivering much more horsepower and, to cap it all, bring along the x86 ecosystem.

Sounds very similar to Apple.
There's no rebellion or betrayal here... the engineering groups are choosing the best performance per watt and per dollar, and the architecture with the better road map. Nobody sits in engineering meetings, twiddling their pointy mustache, thinking up ways to stick it to Motorola (or Freescale, or whoever ships PowerPC nowadays).


PostPosted: Tue Aug 16, 2016 6:01 am 
Offline
User avatar

Joined: Thu May 28, 2009 9:46 pm
Posts: 8414
Location: Midwestern USA
BigEd wrote:
BDD develops microprocessor systems and does other engineering consulting. Probably he's not running Steam or Photoshop.

I haven't a clue what Steam is (unless we're talking about locomotives or ships). I do fire up Photoshop once in a great while, but mostly I'm running software that helps me get system work done. Although I am no fan of MS Windows, I do have two PCs around here running Windows XP, since no one seems to have ported good CAD software to Linux. The money mostly gets made on the Linux side of the fence and cute little zoogies floating around on a hi-res screen tend to get in the way of getting the job done. :D However, I do fire up KDE now and then, run Firefox (as I am doing now) and pretend that Linux is Windows. :lol: The graphics work great on my dual Opteron box (two quad core MPUs with a bunch of RAM), Firefox is faster than a speeding bullet, and nothing ever crashes or gets hung up.

As far as getting away from the x86 machinery goes, that's tough to do in the business world. Everyone is riding the x86 bus, and no one else seems to be building a bus that is as powerful and cheap and can carry as many passengers. Most of us have more computing power sitting by our desks than my employer in the 1980s had in their S/370 mainframe. So x86 it is, even though I would love to use POC for something other than a desk ornament. :lol:

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Tue Aug 16, 2016 7:45 am 
Offline
User avatar

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10949
Location: England
For my own part, I spend the bulk of my computer usage in a browser - and that needs to be a modern browser with fast JavaScript, which restricts me to the mainstream and with plenty of RAM. The other large application I sometimes run is the Xilinx toolchain, which means x86 again (or, theoretically, an emulation of that!)

As it happens though, the three main machines in the house, including two on desks, are all laptops. I've never really had need of the desktop form factor, not having multiple hard drives or fancy GPU cards. Oh, except one desktop was useful because it had a parallel port.


PostPosted: Wed Aug 24, 2016 8:26 am 
Offline
User avatar

Joined: Sat Dec 07, 2013 4:32 pm
Posts: 246
Location: The Kettle Moraine
I use an Amiga 2000 for everyday use. I do have a Pi running a VNC server for 128-bit encrypted websites, though.

But most of what I use it for runs natively on the 68060.

Sooner or later I'll put my A3000UX back together and I won't need the Pi. (Well not for that anyway...)


PostPosted: Wed Sep 07, 2016 1:26 am 
Offline

Joined: Sat Jan 04, 2003 10:03 pm
Posts: 1706
whartung wrote:
The Windows eco-system is pretty much "hard coded" to the x86 architecture. For good and ill, they're pretty much stuck with it. For a short while, Windows NT was dabbling with other processors families (notably the Alpha), but that didn't really last.

Apple is less so. Being based on NeXTStep, Mac OS has been multi-architecture to its core since the mid-90's.


Windows NT, from 3.1 all the way up to Windows 10, rides on the same basic kernel architecture, and is every bit as multi-architecture as NeXTStep, MacOS, and even Linux. Microsoft has demonstrated this effortlessly, with ARM, PowerPC, and x86 versions of their operating system platform. (What do you think runs in the Xbox series of game systems? Yep, Windows NT. That includes the PowerPC-based Xbox 360.)

Believe it or not, many folks thought MS ditched support for its microkernel-style OS architecture years ago, but their recent demonstration of running Bash natively on Windows 10 clearly shows they have not. NT is still just as capable of hosting OS/2, 16-bit Windows, and other platforms from ages ago; the only thing MS did was remove the unused components from the distribution. But the APIs are still there, and are still used to run Win32, Win64, and WinRT themselves.


PostPosted: Wed Sep 07, 2016 1:29 am 
Offline

Joined: Sat Jan 04, 2003 10:03 pm
Posts: 1706
As far as non-x86 platforms for desktop computing go, the best bet is the AmigaOne series. They come equipped with AmigaOS 4.x of some flavor, but the hardware is usually quickly dual-booted into a Linux environment as well. They use PowerPC motherboards.


PostPosted: Fri Sep 09, 2016 9:34 pm 
Offline

Joined: Sat Dec 13, 2003 3:37 pm
Posts: 1004
kc5tja wrote:
Windows NT, from 3.1 all the way up to Windows 10, rides the same basic kernel architecture, and is every bit as multi-architecture as NeXTStep, MacOS, and even Linux. Microsoft has demonstrated this effortlessly, with both ARM, PowerPC, and x86 versions of their operating system platform (what do you think runs in the X-Box series of game systems? Yep, Windows NT. That includes their PowerPC hardware X-Boxes).


They've ported the kernel. I never said anything about the kernel.

I'm talking about the eco-system, the development environment, and the underlying philosophies of the OS and, by projection, the community.

NT has no "fat" binary structure. It's conceptually not there. NeXTStep (and now MacOS) has had this since the early 90's; it still has it, and they still use it. It's core to the architecture of the system. Philosophically, Apple promotes developing for this kind of environment. Developers seamlessly ported to the ARM environments of the portable devices, and the build chain supports this as well.

If Apple were to introduce an ARM-based MacBook, support for that platform would grow very quickly, as most developers would simply need to click a check box and rebuild their code, with little actual interruption to their normal workflow. And once done, their entire supply chain would Just Work: however they were shipping their programs before, they could keep shipping them. They wouldn't have to flag their software in any special way, save some marketing to let customers know that they support the new architecture. Consumers would have no problems installing the software, no downloading the wrong version for the wrong architecture, etc. (assuming they get "the latest"). Developers get all of that "for free".

I have PowerPC apps on my machine at home -- they don't work. They have a nice little circle/slash icon over them because I can't run them. Again, the OS and environ solve this problem, and have for some time.

Support for the Alpha and other NT architectures was slim to none. Most folks' code wouldn't "just port" easily to the new architectures; it wasn't written with them in mind. Objective-C is written at a higher level than normal C or C++, and part of that mandated that coders adopt different idioms during development, which fosters portability. The elements existed on Windows, for sure, but it was very easy to simply not follow them; there was no repercussion. In the Obj-C environment, you pretty much were forced to use them, so portability came "for free".

So, this is why Apple can remain more agnostic. Apple can offer different ARM builds on their phones, and they can have the development tools target different processors if they like. You can even upload a fat binary to the App Store, and the App Store could deliver a properly trimmed binary to the device (I don't know whether they actually do this, but they trivially could), again transparently to the consumer and the developer.
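For the curious, a fat binary isn't magic at the file level: a Mach-O universal binary simply starts with a big-endian header (magic number 0xCAFEBABE, then a slice count) followed by one entry per architecture. Here's a minimal sketch that detects that header; `fat_slice_count` is a hypothetical name, and real tools do far more validation than this:

```c
#include <stdio.h>
#include <stdint.h>

/* A Mach-O "fat" (universal) binary begins with a big-endian header:
 *   uint32_t magic;      // FAT_MAGIC = 0xCAFEBABE
 *   uint32_t nfat_arch;  // number of per-architecture slices that follow
 * This sketch reads the first 8 bytes and reports how many slices
 * the file claims to carry. */
#define FAT_MAGIC 0xCAFEBABEu

static uint32_t read_be32(FILE *f)
{
    unsigned char b[4];
    if (fread(b, 1, 4, f) != 4)
        return 0;
    return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16) |
           ((uint32_t)b[2] << 8)  |  (uint32_t)b[3];
}

/* Returns the slice count if `path` looks like a fat binary, else 0. */
int fat_slice_count(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;
    uint32_t magic = read_be32(f);
    uint32_t n = read_be32(f);
    fclose(f);
    return (magic == FAT_MAGIC) ? (int)n : 0;
}
```

On a real Mac, `lipo -info some_binary` reports the same information; the point is that the format is simple enough that "trimming" a fat binary down to one slice is a mechanical operation.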

Apple is much farther along in this space than others.

Android punts on it all by targeting the VM (which is clearly another technique).


PostPosted: Mon Sep 12, 2016 4:29 am 
Offline

Joined: Sat Jan 04, 2003 10:03 pm
Posts: 1706
whartung wrote:
They've ported the kernel. I never said anything about the kernel.

I'm talking about the eco-system, the development environment, and the underlying philosophies of the OS and, by projection, the community.


I'm just going to agree to disagree with you, but I will conclude with this: the underlying philosophies of the OS are very much an export of the NT ecosystem, regardless of the hardware port in question. You actually have no choice but to export the development philosophy; it's the kernel, and that pervades everything. One need look no further than the relative philosophies of the Linux and BSD maintainers to see this in action in a visceral way.

NT has no fat binary structure; you're right. That's because it doesn't need one. But then again, neither does Linux, which is undeniably the most ported operating system on the planet to date, running on everything from "wristwatches to supercomputers".

Quote:
Developers seamlessly ported to the ARM environments of the portable devices, and the build chain supports this as well.


Going to just nod my head in a silent "OK" on this. I know a few iPhone developers personally, and they have a very different opinion than you do about the ecosystem, how easy things are, etc.

Quote:
...as most developers would need to simply click a check box and rebuild their code with little actual interruption to their normal workflow.


IDEs are wonderful things, aren't they? Of course, I can do the same with most open-source products, sans IDE or philosophically compatible runtime environment, with a simple:

Code:
$ MARCH=doohickey make blah


Platform retargetability is an intrinsic feature of the GNU (and, for compatibility, LLVM) toolchains in particular. Remember, Xcode is not anything special; it sits almost entirely on top of GNU and/or LLVM tools.

Quote:
They wouldn't have to flag their software in any special way,


This is definitely not true; the last time I used Xcode, there were multiple checkboxes I had to configure correctly to get a fat binary that actually worked. That Xcode automates this process most of the time is quite nice, but the configuration has to be there all the same. Thankfully, this is a thing of the past for me, and I suspect for a lot of other people, because the idea simply hasn't taken off at all, and not for lack of trying. Besides NeXTStep, the Oberon community had a similar, and vastly superior, technology called "Slim Binaries" (get it?), which was in many respects the intellectual forefather of both WebAssembly and LLVM. In the POSIX arena, a technology called ANDF was the hero of the day. Etc., etc.

Quote:
Developers get all of that "for free".


You get what you pay for. Nothing is truly free. Nothing.

Quote:
Objective-C is written at a higher level than normal C or C++.


Please illustrate with an example. On the surface, this argument is hard to credit, seeing as Objective-C works at a fundamentally lower level of abstraction than C++ (no templates, and weak type-checking that was fixed only a decade later, and only by making massive changes to the Obj-C compiler semantics, ultimately leading to Obj-C being abandoned in favor of Swift).

Quote:
Part of that mandated that coders adopt different idioms during development that fosters portability.


This makes no sense, and seems irrelevant. As far as my experience is concerned, the things considered good practice in MacOS X are pretty much the same things in Windows and Linux too, the MacOS X User Interface Guidelines notwithstanding. Always check your result codes for errors; always close or free resources that you've opened or acquired; release your acquisitions in the opposite order they were acquired; etc. The community ultimately decides the UI guidelines in use, not the kernel or the APIs per se. This can be seen by studying the KDE vs. Gnome communities, and even, within the Gnome world, the Ubuntu vs. non-Ubuntu sub-communities.
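Those idioms look the same in C on any of those platforms. A minimal sketch, with hypothetical names (`copy_first_line` and the file paths are invented for illustration), showing "check every result code, release in reverse acquisition order":

```c
#include <stdio.h>
#include <stdlib.h>

/* The portable idiom described above: every acquisition is checked,
 * and the goto-cleanup ladder releases resources in the exact reverse
 * of the order they were acquired. Returns 0 on success, -1 on error. */
int copy_first_line(const char *src_path, const char *dst_path)
{
    int rc = -1;
    char *buf = NULL;

    FILE *src = fopen(src_path, "r");   /* acquired 1st */
    if (!src)
        goto out;

    FILE *dst = fopen(dst_path, "w");   /* acquired 2nd */
    if (!dst)
        goto close_src;

    buf = malloc(256);                  /* acquired 3rd */
    if (!buf)
        goto close_dst;

    if (fgets(buf, 256, src) && fputs(buf, dst) >= 0)
        rc = 0;

    free(buf);                          /* released: 3rd first... */
close_dst:
    fclose(dst);                        /* ...then 2nd... */
close_src:
    fclose(src);                        /* ...then 1st */
out:
    return rc;
}
```

Nothing in this pattern is specific to any one OS, which is the point: it's the community and the language, not the kernel, that make it idiomatic.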

Maybe you're talking about something else, more in the middle of the software stack? Or maybe it's just an artifact of the app-approval process for the iPhone and Mac App Stores. But I'd argue that's a social incentive, not a technical one.

