Most of this was done with analog systems, wasn't it? No computers involved.
>analog
>hud
Fuck off, moron. This is the 80s we're talking about. Planes like the F-16 and F-18 literally can't fly without a digital computer controlling them.
the HUD can still be analogue, gay
Not with numerical indications it can't.
You could use radio waves to create the numbers, or use an electromechanical system involving holographics.
It could be just a mechanical instrument projected onto a screen, easy to do.
2600 was made to display sprites. It had limitations because it was made for one thing: sprites on a TV.
That UI is 100% fictional. Go look at DCS footage for actual 80s-vintage Tomcat HUD symbology.
Because it was all VERY highly specialized computer hardware. You had chips that were designed JUST to run this HUD and basically only ever do that. These are not general-purpose computers and can't just be tasked to do anything significantly different.
>it was all VERY highly specialized computer hardware
Ok but I definitely remember seeing a video of dudes playing Star Wars X-Wing on an F-14's MPD.
it's just a Vectrex (1982)
>How did they have this no latency tech in the 80s
by the power of precision analog electronics
>80s
Been around since the 70s, starting with the A-7E
All that stuff is relatively simple to do; it's just math and vector graphics. All you need is an 8-bit processor and some assembly code. Hell, you probably could do that on Atari hardware (although not the 2600; it'd have to be one of their home computers or later consoles). Pic related is a screenshot from the game "Tempest", released by Atari in 1981.
If you want to know why tech today takes so much more effort to do the same things it did back in the 80s, go ask PrepHole. They'll give you a novel-length rant about lazy programmers and pajeets which is mostly accurate to the state of the tech industry.
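To put some meat on the "it's just math" part: a minimal sketch of the projection a HUD symbol generator has to do, assuming a simple pinhole model. Everything in it (the names, the focal constant, the waypoint) is my own illustration, written in C with floats for readability; an 8-bit implementation would use fixed-point and lookup tables instead.

```c
/* Minimal sketch: projecting a 3D point ahead of the aircraft onto
 * 2D HUD coordinates. Illustrative only, not real avionics code. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;  /* z = distance straight ahead */

#define FOCAL 256.0f  /* assumed scale factor; a real unit would be
                         calibrated to the combiner glass geometry */

/* Perspective divide: the further away a point is, the closer its
 * symbol sits to the center of the HUD. */
static void project(Vec3 p, float *sx, float *sy) {
    *sx = FOCAL * p.x / p.z;
    *sy = FOCAL * p.y / p.z;
}

int main(void) {
    Vec3 waypoint = { 10.0f, 2.0f, 500.0f };  /* made-up target point */
    float sx, sy;
    project(waypoint, &sx, &sy);
    printf("draw waypoint marker at (%.2f, %.2f)\n", sx, sy);
    return 0;
}
```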
Atari 2600 could produce some detailed graphics.
Yeah, but most of its processors were single purpose, so it was pretty limited in what it could actually do. For example, it couldn't do vector graphics, so the Star Wars port had to fake it with sprite graphics.
>it couldn't do vector graphics
> Star Wars port had to fake it with sprite graphics
The Star Wars arcade game had a vector display; a TV is raster graphics. A lot of 2600 ports were lazy.
I mostly know about how physically limited the 2600 was from the Halo port (yes, it's real). Master Chief's visor in that was a Pong ball, because that was the only way they could add a second color to the sprite. The reason it worked is that the Pong ball had its own dedicated chip, meaning it didn't have to share colors with the rest of the system.
2600 was specifically designed to play Pong style games. So it has sprites corresponding to the paddles and balls of Pong. More complicated games were created due to programmer ingenuity.
The TIA video chip was. You can get round it, though. The deeper problem is that its only representation of the screen is what it exposes directly to the display; there is no frame buffer. And you have to 'race the beam', because the whole thing gets its timing off the display raster, so you have to have changes to the screen ready before the electron beam gets there. It's functionally just about a computer, but extremely close to the hardware. Fun to program for, actually, if you have the interest.
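For anyone curious what that loop structurally looks like, here's a sketch in C (the real thing is cycle-counted 6502 assembly; WSYNC, COLUP0 and GRP0 are real TIA register names and offsets, but the TIA[] array here is just a stand-in so the sketch compiles):

```c
/* "Racing the beam", roughly: per-scanline updates with no frame
 * buffer. On hardware, TIA[] is memory-mapped register space. */
#include <stdint.h>

enum { WSYNC = 0x02, COLUP0 = 0x06, GRP0 = 0x1B };
static uint8_t TIA[0x40];  /* stand-in for the TIA's registers */

static const uint8_t sprite[8] = {  /* an 8x8 ball-ish shape */
    0x3C, 0x7E, 0xFF, 0xFF, 0xFF, 0xFF, 0x7E, 0x3C
};

int main(void) {
    int sprite_y = 90;  /* hypothetical player position */

    /* One frame: for each visible scanline, everything the TIA will
     * show on that line must be latched BEFORE the beam draws it. */
    for (int line = 0; line < 192; line++) {
        int row = line - sprite_y;
        TIA[GRP0]   = (row >= 0 && row < 8) ? sprite[row] : 0;
        TIA[COLUP0] = (uint8_t)line;  /* colors can change every line */
        TIA[WSYNC]  = 1;  /* strobe: on hardware this stalls the CPU
                             until the current scanline finishes */
    }
    return 0;
}
```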
Software bloat and lazy code and scripting are the bane of modern processing, yes.
A 1970s computer designed to do one thing (take 2 or 3 offset images, determine speed and distance from the offset between those images, and then project that info) is stupidly simple.
Running 300 apps and scripts in the background just to keep your OS running, while also constantly receiving and sending wireless data packets over wifi, multiple Bluetooth devices, shared network devices, etc. etc., while also running a game poorly coded by an overworked, sleep-deprived 23-year-old, is absolutely brutal in comparison.
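For scale, the entire job that 70s box is doing amounts to a couple of divisions. A sketch, with the stereo model and every number invented purely for illustration, not taken from any real avionics:

```c
/* My guess at the kind of arithmetic an "offset images" computer
 * would be doing. Illustrative numbers only. */
#include <stdio.h>

int main(void) {
    /* Distance from two offset images: range is inversely
     * proportional to the pixel disparity between them. */
    double baseline_m = 0.5;    /* assumed separation of the sensors */
    double focal_px   = 800.0;  /* assumed focal length, in pixels */
    double disparity  = 4.0;    /* measured offset between the images */
    double range_m    = baseline_m * focal_px / disparity;

    /* Speed from two frames taken dt apart: how far the scene
     * shifted, scaled back into meters at that range. */
    double dt_s      = 0.1;
    double shift_px  = 12.0;
    double speed_mps = (shift_px / focal_px) * range_m / dt_s;

    printf("range %.0f m, ground speed %.1f m/s\n", range_m, speed_mps);
    return 0;
}
```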
Oh, I agree completely; that's why I specified "the same things". I was referring to simple tasks which computers have been able to do for decades being needlessly overcomplicated by modern devs.
Also
>Game
>Coded
>In current year
Lmao
>>Game
>>In current year
>Lmao
I haven't touched the stuff since college in the late 2000s; you're right, it's all scripts going down now.
Unreal makes stuff so easy that code is limited to highly specific design scenarios where clicking together function blocks won't cut it, and they're improving it more and more to not need even that.
>go ask PrepHole.
Jesus don't do that. They are by far the fucking dumbest fuckers on the PrepHole.
You're saying that like wartime PrepHole doesn't exist
>They'll give you a novel length rant about lazy programmers and pajeets which is mostly accurate to the state of the tech industry.
I listened to rants like this for a couple of decades. Had a /tg/ friend that tried to make a career of being a Navy officer but got laid off during a Carter-era military cutback. Went back to college, grabbed a master's in programming in 1981, and jumped into the military-industrial complex. I know one of the big things he programmed on was the original Advanced Field Artillery Tactical Data System (AFATDS). I know the big disconnect he had with young people was the amount of emphasis and bloat put into GUIs. Fancy graphics that add nothing to the function and only rob processing resources.
You know the adage of "work expands so as to fill the time available for its completion"?
A rather similar thing happens when it comes to compute resources and software. There's little impetus to highly optimize software when the hardware is so powerful. Like designing an incredibly comfy saddlebag for a horse to carry two and a half pounds on each flank.
I know, but it's still mildly infuriating that I need a 3 GHz CPU and 8 GB of RAM to browse the web without slowdown when a 300 MHz Pentium III could do the same exact thing 20 years ago.
PrepHole is borderline retarded. Don't ask them, for the sake of your sanity.
But to help out a bit: highly specialized chips, fabricated specifically to do one task only. That's what powered 70s/80s fighter jets.
Nowadays with newer jets, it's mostly real-time computing.
The gist of RTC is to program applications in such a way that every function or calculation meets a deadline. Typically programmed with languages that are designed for this, like Ada.
https://en.m.wikipedia.org/wiki/Real-time_computing
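In practice the gist looks something like this: a loop that sleeps to an absolute deadline instead of "whenever it gets around to it". A minimal sketch using POSIX calls as a stand-in (a real system would be an RTOS with Ada tasking or similar, and the 20 ms period is an assumed rate, not from any real aircraft):

```c
/* Periodic task with a fixed deadline: the core real-time pattern. */
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <time.h>

#define PERIOD_NS 20000000L  /* 20 ms -> 50 Hz */

static void update_hud(void) {
    /* Bounded-time work goes here: read sensors, redraw symbology.
     * The whole point is this must always finish within the period. */
}

int main(void) {
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int i = 0; i < 250; i++) {  /* 5 seconds of 50 Hz frames */
        update_hud();
        /* Sleep to an ABSOLUTE deadline so the period never drifts,
         * no matter how long update_hud() took this cycle. */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    puts("ran 250 frames at a fixed 50 Hz");
    return 0;
}
```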
t. PrepHoleautist
indians and webshits can't program
Most military aircraft from at least the F-14 Tomcat onward used custom-made digital integrated circuits.
The F-14's CADC chipset had a 20-bit FPU capable of carrying out parallel multiplication and division operations, plus an ADC.
They used real-time operating systems running on a cluster of interconnected instruments as opposed to a single computer running a general purpose OS. The latency is programmed away by bounding all computational tasks to a specific timeline.
This is the case for aircraft and many modern cars. You'll also find it powering factories, ships, and Martian rovers.
Martians don't exist, idiot.
it just werks
I'm surprised that reflector-glass HUDs haven't become common on cars given how simple they are.
I know someone who tried to Kickstart one, it's not feasible for small players to do, and the various countries' road transport authorities don't (yet) allow the big boys to do HUDs. I suspect they worry about apps interfering with driver concentration.
>because you just KNOW some zoomer is going to try and watch Tiktok on the windshield while doing eighty in a school zone
The C5 and C6 Corvettes both had a HUD.
Not for projecting anything useful, like a map
Yeah, they were quite basic. But the point is that the anon above is clearly mistaken about there being some sort of big legal issue with HUDs in general. They have been done before. Hell, I just googled "cars with huds" and there are stacks of relatively recent cars with them. I have no idea how good they are, but they're certainly a thing.
That's me
I should have clarified: when I say HUD, I don't mean something that just shows basic dashboard readouts. That's useless. I mean a HUD that projects a phone screen or car navigation screen. It would be very useful to display a map; you wouldn't have to take your eyes off the road to read it.
AFAICT, no current HUDs have anything like that. They're quite basic.
So what is the exact wording of this law or policy which allows some HUDs and not others? What exactly is it that they are banning?
how the fuck do I know, I was just speculating
YOU tell me why no car manufacturer has produced even just the OPTION to put a HUD map up, when clearly it's well within technical capabilities
>I was just speculating
I figured.
I have no idea why no car makers have done this. I just call BS on the theory that the law is the problem.
More than likely no manufacturer wants to deal with insurance company bullshit that would accompany it.
Only know this because I got a loaner from the dealership, but Mazda does them now, at least in their crossovers. It's pretty slick and way less intrusive than the gay-ass digital displays and voice systems on other brands. They call it an active driving display or some shit.
>They call it an ADD or some shit.
Fitting. Active Driving High-definition Display when?
Whenever a manufacturer decides to dick around and see if it'll be popular. Their current ones are actually pretty solid quality and don't get washed out by bright light; they basically work like an RDS reflection. The one in my loaner can show a compact version of the full dashboard plus basic GPS and direction functions. It'll also track the speed limit for a given road, upcoming stoplights and signs, and a few more things I'm sure I haven't seen yet.
They're starting to pick up traction, my 2019 BMW X5 has one
Surprisingly, you don't need a 4090 to run a HUD (which is just a PowerPoint presentation).
Atari games could have extremely low latency, actually. The hardware did not have any memory to buffer things in, so the game had to read input as it was generated, advance game logic when it could and alter the image being emitted to the screen pretty much at the same time. This could result in latencies of 1/60 of a second or less, which is fine for most human users.
Modern machines are easily thousands of times faster, but they're also vastly more complicated than this. Nobody wants to program them like people did back then, and nobody wants to pay for competent software engineering either, so we get systems where updating the screen in response to a keypress takes longer than pinging another machine on the other side of the planet.
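The back-of-envelope numbers, assuming a 60 Hz display and (as an illustration, not a measurement) about four buffered stages in a modern pipeline:

```c
/* Latency arithmetic for the two designs described above. The
 * "4 stages" figure (input queue, game tick, render queue,
 * compositor) is an assumption for illustration. */
#include <stdio.h>

int main(void) {
    double frame_ms = 1000.0 / 60.0;  /* one 60 Hz frame */

    /* Old style: input read and picture altered within the same
     * frame, so worst-case latency is about one frame. */
    double single_loop_ms = frame_ms;

    /* Modern style: each buffered stage adds roughly a frame. */
    int    stages    = 4;
    double modern_ms = stages * frame_ms;

    printf("single loop: <= %.1f ms, buffered pipeline: ~%.1f ms\n",
           single_loop_ms, modern_ms);
    return 0;
}
```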
But latency doing something that basic isn't really much of a hardware performance issue.
stfu nerd youll never have sex lamo
but the pajeets are SOOOOOOO cheap!
Most families weren't willing to spend $15,000 on a console to play video games.
How does an aircraft know it's been locked on?
In addition to its own radar, the aircraft also has a radar receiver in it which detects when outside radar hits the plane. Certain types of signals and their behavior are indicative of a radar lock-on.
Dedicated detectors for radar, laser and other peculiar signatures.
They don't necessarily always work, especially with modern systems that can find the target using guidance data linked from somewhere else.
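A very rough sketch of the "signal behavior" heuristic: a search radar's beam only sweeps past you periodically, while a tracker paints you continuously with a steady pulse pattern. The struct, thresholds and numbers below are all invented for illustration, not real RWR parameters:

```c
/* Toy classifier: continuous, regular illumination looks like a
 * single-target track; brief periodic hits look like a search sweep. */
#include <stdio.h>

typedef struct {
    double illum_fraction;  /* share of time this emitter paints us, 0..1 */
    double pri_jitter;      /* variability of its pulse spacing, 0..1 */
} EmitterStats;

static int looks_like_lock(EmitterStats e) {
    return e.illum_fraction > 0.8 && e.pri_jitter < 0.1;
}

int main(void) {
    EmitterStats search = { 0.05, 0.30 };  /* brief periodic sweeps */
    EmitterStats track  = { 0.95, 0.02 };  /* steady illumination   */
    printf("search radar -> lock warning: %s\n",
           looks_like_lock(search) ? "yes" : "no");
    printf("track radar  -> lock warning: %s\n",
           looks_like_lock(track)  ? "yes" : "no");
    return 0;
}
```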
The aircraft knows where radar is because it knows where it isn't. By subtracting where it knows radar isn't from where it knows radar is, it can get a clear picture of a radar lock. When a radar is activated, the airplane detects where the radar now is, and can subtract it from where it knows radar isn't. This can be called the radar direction indicator. When the radar source moves, the place where the radar waves were is now where the radar waves aren't, and conversely, the place where the radar waves are is the place where the radar waves weren't. The plane then updates its knowledge and depiction of where the radar waves are so the pilot can avoid them and move to where they aren't.
There was no latency on Atari vector graphics machines, zoom-zoom.
You got HUDs with Atari level graphics XD