
PC Gamers - Whatcha playin?

History of Graphics cards


Computer video card history

Updated: 11/30/2020 by Computer Hope
1981: IBM developed its first two video cards, the MDA (Monochrome Display Adapter) and the CGA (Color Graphics Adapter). The MDA had 4 KB of video memory and the CGA had 16 KB.
1982: Hercules Computer Technology, Inc. developed the HGC (Hercules Graphics Card), its answer to IBM's video cards. Hercules took IBM's MDA standard, combined it with bitmapped graphics, and gave the HGC 64 KB of video memory.
1983: Intel entered the video card market with the iSBX 275 Video Graphics Multimodule, capable of displaying eight colors at a 256 x 256 resolution.
1984: IBM introduced the PGC (Professional Graphics Controller) and the EGA (Enhanced Graphics Adapter).
1987: The Video Graphics Array (VGA) standard was released, providing a 640 x 480 resolution with 16 colors and up to 256 KB of video memory.
1987: ATI introduced its first VGA video card, the ATI VGA Wonder. Some VGA Wonder cards even featured a built-in mouse port.
1991: S3 entered the video card market with its S3 911 and 911A graphics chips, which provided up to 256-color graphics. S3 mostly sold its graphics chips to OEMs to integrate into computer motherboards, since they were inexpensive, though of lower quality.
1992: Developed by Silicon Graphics Inc., OpenGL was released in June 1992. OpenGL was used for rendering 2D and 3D vector graphics in video games, computer-aided design (CAD), virtual reality, and other applications.
1996: 3dfx introduced the Voodoo1, the first of its Voodoo line of video cards. It required a 2D video card to be installed alongside it and provided 3D graphics rendering for computer gamers. Voodoo video cards quickly became popular among computer gamers.
1997: NVIDIA released the RIVA 128 graphics accelerator chip, allowing video card manufacturers to incorporate both 2D and 3D graphics acceleration into their cards. The RIVA 128 was intended as NVIDIA's answer to the Voodoo1, but its graphics rendering was lower quality.
1998: 3dfx released the Voodoo2 video card in February 1998, replacing the Voodoo1. It was the first video card to provide SLI (Scan-Line Interleave) support, allowing two video cards to work together for better graphics performance.
1999: NVIDIA fully made its presence known in the video card market with the release of the GeForce 256 GPU (graphics processing unit) on October 11, 1999. It is widely considered the world's first GPU, provided full support for DirectX 7, and featured 32 MB of DDR memory.
2000: ATI introduced the Radeon R100 series, beginning the long-lasting Radeon line of video cards. The first Radeon cards were fully DirectX 7 compatible and featured ATI's HyperZ technology.
2001: NVIDIA released the GeForce 3 series in March 2001, the first video cards in the world to feature programmable pixel shaders.
2002: ATI released the Radeon 9700 in October 2002, the first Direct3D 9.0 accelerator video card on the market.
2006: ATI was acquired by AMD in 2006. AMD no longer uses the ATI name for the Radeon video card series.
2010: Audi began using the NVIDIA Tegra GPU to power the dashboard in its cars.
2013: Sony and Microsoft released the PlayStation 4 and Xbox One, respectively. Both gaming consoles used a GPU based on AMD's Radeon HD 7790 and 7850 video cards.
2020: NVIDIA announced it was acquiring Arm for $40 billion on September 13, 2020.
2020: NVIDIA released its highly anticipated RTX 30 series graphics cards in September and October 2020.
 
Intel 4004 CPU

From Wikipedia, the free encyclopedia
Intel 4004
(image: open Intel 4004 processor)
  • Launched: November 15, 1971
  • Discontinued: 1981
  • Common manufacturer(s): Intel
  • Max. CPU clock rate: 740-750 kHz
  • Data width: 4 bits
  • Address width: 12 bits (multiplexed)
  • Application: Busicom calculator, arithmetic manipulation
  • Technology node: 10 μm
  • Instruction set: 4-bit BCD oriented
  • Transistors: 2,300
  • Package(s): 16-pin DIP
  • Socket(s): DIP16
  • Successor(s): Intel 4040
The Intel 4004 is a 4-bit central processing unit (CPU) released by Intel Corporation in 1971. Sold for US$60 (equivalent to roughly $450 in 2023), it was the first commercially produced microprocessor and the first in a long line of Intel CPUs.
The 4004 was the first significant example of large-scale integration, showcasing the superiority of the MOS silicon gate technology (SGT). Compared to the incumbent technology, the SGT integrated on the same chip area twice the number of transistors at five times the operating speed. This step-function increase in performance made a single-chip CPU possible, replacing the existing multi-chip CPUs. The innovative 4004 chip design served as a model for how to use the SGT for complex logic and memory circuits, thus accelerating the adoption of the SGT by the world's semiconductor industry. The developer of the original SGT at Fairchild was Federico Faggin, who designed the first commercial integrated circuit (IC) that used the new technology, proving its superiority for analog/digital applications (the Fairchild 3708, in 1968). He later used the SGT at Intel to obtain the unprecedented integration necessary to make the 4004.
The project traces its history to 1969, when Busicom Corp. approached Intel to design a family of seven chips for an electronic calculator, three of which constituted a CPU specialized for making different calculating machines. The CPU was based on data stored on shift-registers and instructions stored on ROM (read only memory). The complexity of the three-chip CPU logic design led Marcian Hoff to propose a more conventional CPU architecture based on data stored on RAM (random access memory). This architecture was much simpler and more general-purpose and could potentially be integrated into a single chip, thus reducing the cost and improving the speed. Design began in April 1970 under the direction of Faggin aided by Masatoshi Shima who contributed to the architecture and later to the logic design. The first delivery of a fully operational 4004 was in March 1971 to Busicom for its 141-PF printing calculator engineering prototype (now displayed in the Computer History Museum in Mountain View, California). General sales began July 1971.
A number of innovations developed by Faggin while working at Fairchild Semiconductor allowed the 4004 to be produced on a single chip. The main concept was the use of the self-aligned gate, made of polysilicon rather than metal, which allowed the components to be much closer together and work at higher speed. To make the 4004 possible, Faggin also developed the "bootstrap load", considered unfeasible with silicon gate, and the "buried contact" that allowed the silicon gates to be connected directly to the source and drain of the transistors without the use of metal. Together, these innovations doubled the circuit density, and thus halved cost, allowing a single chip to contain 2,300 transistors and run five times faster than designs using the previous MOS technology with aluminum gates.
The 4004 design was later improved by Faggin as the Intel 4040 in 1974. The Intel 8008 and 8080 were unrelated designs in spite of the similar naming.

 
amd 60% the price but 80%-105% the performance my nikka

Winner winner chicken dinner
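
quick math on what that claim implies for value, taking the 60% / 80-105% figures above at face value (ballpark forum numbers, not benchmarks):

```python
# Perf-per-dollar implied by the claim above: ~60% of the price for
# ~80-105% of the performance. Figures are illustrative, not benchmark data.
price_ratio = 0.60          # AMD price relative to the competing card
perf_ratios = (0.80, 1.05)  # low and high ends of the claimed performance range

for perf in perf_ratios:
    value = perf / price_ratio
    print(f"{perf:.0%} perf at {price_ratio:.0%} price -> {value:.2f}x perf per dollar")
# prints roughly 1.33x (worst case) to 1.75x (best case) performance per dollar
```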

yo boy bear

i had ATI when you were in elementary school, I had an HD 4870 X2 and failed a custom flash with incubation of a GTX 280.

just because we're in a drug forum doesn't mean we all do drugs, and dot


ATI was something, they more or less marketed the RX 550 as ATI on the CD-drive driver (Sapphire). Now AMD is Sapphire, not ATI. Watch out for salt levels before food becomes ant prey. Why weren't they isolated and transferred to a cell of a water body just to be spit out by a whale? They should have been closed by now. The whole company.

AMD isn't. It's a reality you get jetlag by the simple excuse of financials, which I can accept. Or not; a few posts back you proved what I shaped into words, which I'll not repeat.

be happy with ur ground fertilizer and wrap up ur shit and gtfo if you continue with this alternative attitude.
 
Let's rephrase that for clarity ...

"How dare you !"

190923112443-04-greta-thunberg-un-0923.jpg
 
Been playing Amnesia recently. The Bunker got me pooping my pants.

can look between fingers.. and say most players do not know how to operate this game. I did a Hard run, with a Normal run first to scout the map, bc object locations shift but the blueprint of the bunker doesn't. Like I said, hard run with only the lighter.. bunker used twice for.. *spoilers*. Around 10 hrs bc I played about 20 min per day. Had only 1 shotgun shell and got another ending. I threw the bunny within the bunker.. let Lambert have the ability to verbate
 
I can't maneuver the Resident Evil series as mature wallpaint, it's just easter egg paint.. A machine that fails to contribute to the mind and is only for financials. KONAMI might be the Saga of the 23rd century with this verbal approach.
 
Feel like upgrading my gpu to maybe a 6700 xt or 7700 xt by xmas and buying a 1440p screen

that's my setup now, i'm happy with it. at 27", i can't really tell the difference between 1440 and 4k visually, and i use the two side-by-side every day.

i don't think it's really worth going to 4k for anything less than 32", and at that point the price premium is excessive.

there aren't many good 4k options right now. if i could even find a 32" 4k with the specs of my 1440, i would have said fuck it and got that with 6950.
 
good call. 6700 or 6750 will probably be fine, i've run some AAA titles on the 6700 and there's no struggle at 1440. 1440 is not as big a step from 1080 as 4K is from 1440.
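
putting rough numbers on those resolution steps (standard 16:9 modes assumed):

```python
# Pixel counts for common 16:9 resolutions; a bigger jump means more GPU load.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"1080p -> 1440p: {pixels['1440p'] / pixels['1080p']:.2f}x the pixels")  # ~1.78x
print(f"1440p -> 4K:    {pixels['4K'] / pixels['1440p']:.2f}x the pixels")     # ~2.25x
```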

if it comes down to a choice between good display + 6700 or less-good display + 7700, i would prioritize the display.

i deal with display technologies a lot and can tell you for sure there's a big difference in image quality even between similar priced displays. it's important to get that right, especially if you play atmospheric titles and want to enjoy features like HDR.
 
I might ask you a thing or 2 about some fairly priced displays. Im hoping the 7700 wont break the bank but I'll probably have to settle for the 6700
 
@thujone What do you think of rx 7600/6650xt for 1440p? About 13% slower than 6700xt

yea, most games are playable (>60 fps) at 1440 on the 7600. it's just that most displays now do at least a 144Hz refresh rate (i.e. they can show up to 144 fps), so whether you think the 7600 is a 1080 card or a 1440 card really just depends on whether you can tell the difference between 60Hz and 144Hz.
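
for context, here's the frame-time budget each refresh rate implies (just the reciprocal of the rate, nothing vendor-specific):

```python
# Frame-time budget: how long the GPU has to render one frame at each refresh rate.
for hz in (60, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 144 Hz -> 6.9 ms per frame
```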

some people swear they can tell the difference between 60 and 144 but there are games capped at 60 with millions of active players so it seems a bit like BS if someone were to say 60 is unplayable.

anyway, the other option is to use radeon super resolution, which is upscaling but works pretty well. you can try it on your 6600 - just pick a game where you can't run ultra settings smoothly at native resolution, enable RSR (set game resolution one mode lower than native) then see how it runs.
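
to make "one mode lower than native" concrete, here's rough math for an assumed 2560x1440 display rendering at 1920x1080 (the modes your game actually offers may differ):

```python
# RSR-style setup sketch: render below native and let the driver upscale.
# Resolutions here are assumed for illustration, not taken from any driver API.
native = (2560, 1440)   # assumed monitor resolution
render = (1920, 1080)   # game set one mode below native

scale = native[0] / render[0]   # same factor on both axes (aspect ratio unchanged)
pixel_savings = 1 - (render[0] * render[1]) / (native[0] * native[1])

print(f"upscale factor: {scale:.2f}x per axis")          # ~1.33x
print(f"GPU renders ~{pixel_savings:.0%} fewer pixels")  # ~44% fewer than native
```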

so yeh, it won't be able to handle as much as the 6700 but it's plenty capable and if you're looking at >$100 price difference then 7600 could be better value.
 