
Playstation 4 – Secondary Processor & 2Gb RAM Discovered + Video Discussion

The PlayStation 4 console’s internals have been ripped apart, and the teardown reveals several hidden components, including a secondary low-power processor and 2Gb of DDR3 SDRAM.

Read Full Story >>
redgamingtech.com
Godmars2903834d ago (Edited 3834d ago )

Betting that will go away with the second gen production model.

Edit:
Wait, so this means the PS4 actually has 10GB of RAM?

Eonjay3834d ago

Check out the video with decent technical explanation.

mikeslemonade3834d ago

Now wouldn't it be a total surprise if that were actually the built-in PS3 player, hence the 256MB.

Anyway, it could be used just to run miscellaneous apps like the web browser, which is as slow as you'd expect.

pedrof933834d ago (Edited 3834d ago )

We just found out that the PS4's features don't eat 10 percent of the GPU; they actually have their own hardware.

Computersaysno3834d ago (Edited 3834d ago )

Additional coprocessors were talked about on reveal.

ARM based coprocessor + dedicated memory means lower main processor overheads = win.

If we are lucky, Sony has at least a modern dual-core A9 or something in there. There is a good chance of this, since AMD's chip-making spin-off manufactures these.

Quite a reasonable little processor if it is designed to deal with low-power states etc.

GrizzliS19873834d ago

does this mean the ps4 has its own "cloud powered" extras but only within the hardware and not the "net"?

0ut1awed3834d ago (Edited 3834d ago )

I don't currently have time to watch, but isn't this the processor used for video encoding and decoding? AKA the processor that allows for video streaming and Remote Play?

Sony has already said multiple times that they would have a second processor in the PS4 to do so. They even pointed it out in their official disassembly video.

I would imagine the extra RAM is linked to that as well.

CryofSilence3834d ago (Edited 3834d ago )

GrizzliS1987, Shuhei Yoshida already confirmed that developers have the option of "cloud power" with the PS4. Microsoft just capitalized on it.

This effectively frees up more GDDR5 RAM/GPU/CPU for games, which is a great architectural philosophy.

SuicideKing3834d ago

I really dig when people break down the insides of consoles like this. It's interesting and I usually learn something.

nukeitall3834d ago (Edited 3834d ago )

@CryofSilence:

"GrizzliS1987, Shuhei Yoshida already confirmed that developers have the option of "cloud power" with the PS4. Microsoft just capitalized on it."

Having the ability is quite different from having the infrastructure.

The scale and availability of clouds differ significantly, i.e. you could have 100 servers in a farm and call that a cloud. That doesn't make for a good experience.

Heck, both PS3 (and by extension PS4) and Xbox 360 use the cloud, just not at the scale indicated for Xbox One.

Thus you need the scale of Amazon Web Services to match something like MS Azure, a cloud with built-in features and tools that make it easy to harness cloud power.

Most game developers have little experience with the cloud, as nobody can really afford it. That is why MS providing it free to game devs is a huge thing.

Scale, availability, tools and price are all at developers' fingertips with MS Azure.

HelpfulGamer3834d ago

Sony Gaikai Cloud Gaming Processor!

CryofSilence3834d ago (Edited 3834d ago )

@nukeitall

A valid argument; however, it may actually work out contrary to expectations. Gaikai has a much more mature infrastructure than the Xbox cloud (i.e. it has existed longer), albeit possibly on a lower budget. Gaikai's focus currently is to stream entire PS3 games, so the infrastructure is/will be there. The efficiency of the service on either side has not been demonstrated, but it is safe to assume that they will offer similar performance gains.

P0werVR3833d ago (Edited 3833d ago )

@GrizzliS1987@CryofSilence

Nothing in the PS4 has any of those capabilities. Only the Xbox One does. The PS4 will ONLY have Gaikai for game streaming. Remember, the PS4 doesn't need it, right?!

As expected, NOTHING interesting here. Just a last-minute increase from 4 to 8GB of RAM. No new technology, just additions. Except for the customized GCN architecture, but that goes for any console.

Oh boy, things are starting to get interesting in the coming years. No doubt Sony will bring out its big guns with its powerhouse first-party studios, and we'll see what Microsoft comes up with.

EDIT: @CryofSilence

Gaikai is for "streaming"; Azure is for computing tasks. There is A LOT more developers can do with Azure, like extra development tasks even after release. I want to see what Insomniac will do with Sunset Overdrive, or Black Tusk Studios. You'll see how forward-looking Microsoft was with their technology. Besides, you need the servers to truly facilitate ANY cloud-based system, and Microsoft HAS that "infrastructure". Sony doesn't.

P0werVR3833d ago

^

Of course he would say that. The fact of the matter is, even looking at the design, it shows clearly that they have no such capabilities. They have NO articles or explanations of such features. Yoshida is a human being with his own intentions, not a God who speaks truth.

CryofSilence3833d ago (Edited 3833d ago )

@P0werVR

Since you saw fit to PM me what amounts to the same message (for whatever reason--I really don't care), I'll post both here and in PM. ;)

"The proof is in the pudding. ;) I haven't seen a convincing demonstration by either Microsoft or Sony, and I believe relying on the cloud for power is less than ideal to begin with. In theory, they can both provide the same service: rendering in the cloud and streaming the output image to the console either as a separate image or overlying mesh. Still, neither one of us knows remotely how these will work in real life, so let's not get volatile. :3"

Nitrowolf23834d ago

"facepalm"

Did you not read the article? This is the video-uploading processor that Sony was talking about. Why would they remove something they are pushing so hard for? It would basically render the Share button useless without it.

GarrusVakarian3834d ago (Edited 3834d ago )

2Gb, not 2GB

2Gb = 256MB

nix3834d ago

I thought GB and Gb meant the same. /:

WalterWJR3834d ago (Edited 3834d ago )

B - byte
b - bit
8 bits = 1 byte

This is why Internet service providers advertise their speeds in bits. Bigger numbers sound better.
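The conversion everyone keeps doing in this thread is just a divide-by-8; a quick illustrative sketch (the helper name is made up):

```python
# 8 bits = 1 byte, so a size quoted in gigabits (Gb) is 8x smaller
# when expressed in bytes. Using binary prefixes (1 Gb = 1024 Mb):

BITS_PER_BYTE = 8

def gigabits_to_megabytes(gigabits: float) -> float:
    """Convert gigabits to megabytes (binary prefixes)."""
    megabits = gigabits * 1024
    return megabits / BITS_PER_BYTE

# The PS4's secondary 2Gb of DDR3:
print(gigabits_to_megabytes(2))  # -> 256.0
```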

elmaton983834d ago

Walter, thanks for the explanation, good sir. I myself didn't know the difference between them.

Madderz3834d ago

You'd think it would make more sense to just say 256MB so people don't get confused lol

Especially with all the GB penis measuring contests going on!! LMAO

BadlyPackedKeebab3834d ago

@Walter,

Not strictly true; it's to do with the way the data is sent or used.

When things are talked about in bits, such as the internet, it's because data is sent one bit at a time: the line can only be in one of two states, on or off, aka 1 or 0.

If the communication mechanism could send a whole byte in one state, e.g. 10101010, then its speed would be talked about in bytes.

That's the real reason, though things have been confused more recently. You are right, though; I think broadband providers use bits to con the average pleb who doesn't understand bits/bytes/kilobits/kilobytes etc.

NateCole3834d ago

b = bits
B = bytes (1 byte = 8 bits)

Therefore 2Gb / 8 = 256MB

Cobberwebb3834d ago (Edited 3834d ago )

256MB ;)

bigfish3834d ago

Yeh!....GB is not the same as Gb everyone knows that!.... noob.

bigfish3834d ago

I was being sarcastic by the way..lol. give me more disagrees I love it!!!

BadlyPackedKeebab3834d ago

Not to even get into the fact that there are 1024 bits in a kilobit etc., not a round thousand.
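The 1024-vs-1000 point above changes the arithmetic slightly; a sketch of both conventions (the helper names are made up for illustration):

```python
BITS_PER_BYTE = 8

def gb_to_mb_decimal(gigabits: float) -> float:
    # SI (decimal) prefixes: 1 gigabit = 1000 megabits
    return gigabits * 1000 / BITS_PER_BYTE

def gb_to_mb_binary(gigabits: float) -> float:
    # Binary prefixes (strictly "gibibits"): 1 Gib = 1024 Mib
    return gigabits * 1024 / BITS_PER_BYTE

print(gb_to_mb_decimal(2))  # -> 250.0
print(gb_to_mb_binary(2))   # -> 256.0, the figure quoted in this thread
```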

RyuCloudStrife3834d ago

Bro get ready to be schooled by people who know everything lol anyways this is great news...

GraveLord3834d ago

2Gb = 256MB
Gigabit /=/ Gigabyte

So PS4 has 8GB of GDDR5 and 256MB of DDR3.

JsonHenry3834d ago

I figured background streaming had its own dedicated setup (and then they confirmed it previously). But I wonder if it handles anything else? 256MB of RAM seems excessive for such a simple task alone, so I'm sure it does more.

This reminds me of when they found the Ana chip in the 360 which allows it to upscale anything independent of the CPU.

Glad I ordered my PS4. Can't wait till Wed. to play it.

iceman6003834d ago

no it's for the video card.

CryofSilence3834d ago

It's gigabits, not gigabytes, so divide 2 Gb by 8, and that's how much DDR3 SDRAM it has. As the article says, it likely facilitates or allows for suspend/resume and 15-minute continuous recording.

Ol_G3834d ago (Edited 3834d ago )

Think so too. The Wii U also has 512MB of flash onboard, probably for the same stuff.
http://www.ifixit.com/Teard...

DoctorXpro3834d ago

means you need to go back to school...

xtremeimport3834d ago

I like knowing that Sony actually thought this through. My favourite part of the lead-up to the launch was the skeptics saying "Sony is lying to you all, they won't even go into detail about the internal specs."

It seems like they were confident enough in what the PS4 has to offer that when MS broke down their internals, they didn't need to "fluff their feathers" in defense.

I love my PS4, and if MS wants to make the Xbox appealing to me again, they're really gonna have to bring something special to the table. Launching first and cheaper was a HUGE win for Sony.

Mosiac773834d ago

It doesn't look like it's on the same unit as the other six, and it's DDR, so I wonder what Sony has up their sleeve.

assdan3834d ago

It has 2 gigabits, not gigabytes. So more like 256MB.

JasonKCK3834d ago

Godmars290 dude why didn't you just watch it?

kewlkat0073834d ago

This must be that article where PC people can chip in on hardware terms..

MONOLITHICIDE3834d ago

You mean the supposed master race of technology

JasonKCK3834d ago

Many PC gamers build the rigs they play on. You crapping on their knowledge shows how little you have. That's like crapping on a car mechanic because they can rebuild an engine and you can't.

grahf3833d ago

Console gamers have a bone to pick with PC gamers anyway.
After all, its not like mechanics/gear-heads crap on people for going out and buying pre-manufactured cars.
"You could build a car for less and with better performance than that Prius!"
That doesn't happen.

jackanderson19853834d ago

I thought Sony (could have been Digital Foundry) already said that this secondary processor is used in the low-power state, to enable the console to update while in standby?

BG115793834d ago (Edited 3834d ago )

Same thing here. The secondary chip was announced a long time ago. People probably thought it was one of the processor's cores that was going to be used for the system, not a totally separate processor.

joeorc3834d ago (Edited 3834d ago )

@jackanderson1985

"i thought sony (could have been digital foundry) already said that this secondary processor is used for the low power state to enable the console to update while in standby?"

That's because it can! This is most likely the ARM Cortex-A5 TrustZone CPU that AMD was including on its APUs.

see:

AMD announced a partnership with ARM that will see ARM's Cortex-A5 processor architecture implemented in future x86 chips in a move to boost security. AMD's goal in adopting ARM's Cortex-A5 and TrustZone security is to "provide a consistent approach to security spanning billions of internet-connected mobile devices, tablets, PCs and servers," according to the release.

TrustZone security will be featured on select x86 APUs next year, with a wider variety of products supported in 2014.

http://www.techradar.com/us...

Not only could this Cortex-A5 be used for TrustZone, it's also fully programmable. It's a beefy lil SoC in and of itself, and the fact that it has 256MB all to itself is quite a bit of RAM.

It's on HSA production silicon also.

http://www.xbitlabs.com/pic...

http://www.xbitlabs.com/new...

joeorc3834d ago (Edited 3834d ago )

Whoever disagreed with my post, I would love to see why. Please respond and tell me why what I posted is wrong. Remember, I have seen the SOC PRODUCTION STAMPS ON THE OUTSIDE OF THE CHIP!

IT'S THE CXD90025G, with production number 88EC120-BNS2 stamped right on this SoC!

Here is the link to the RAM for this lil chip:

http://www.samsung.com/glob...

Of course, another phantom disagree with no response... what a shocker... lol

ATiElite3834d ago (Edited 3834d ago )

Good post my friend.

The ARM A5 for TrustZone sounds like what it is, because TrustZone is something Netflix and credit card companies like to see when doing business over the internet, which is exactly what the PS4 will be doing.

Also, since the chip is programmable, it can be used for small background tasks.

So I see the chip handling background internet services, acting as TrustZone to secure data, and ALSO, if Sony wanted, acting as a BLOCK on used games.

You provided a ton of data, unlike most people on here, and stealth disagrees only mean the people doing them DON'T have a clue.

Good post!

ThanatosDMC3834d ago

N4G golden rule:

If you complain about people disagreeing with your comment, more people will press disagree because it's funny.

dcbronco3834d ago

ThanatosDMC, I disagreed with your comment. But I gave you a bubble up for an intelligent comment. After all, this is N4G.

Pogmathoin3834d ago

Thana and Bronco... I am confused now.... Yay or nay.... Agree or disagree.... But for acknowledging the truth about N4G... Bubble up each....

MikeGdaGod3833d ago

+bubble @ThanatosDMC

on topic: this is awesome

FamilyGuy3834d ago

They mentioned the secondary processor but never went into detail on things like the memory it uses or how much power it (alone) draws in that low-power state.

Pretty crazy how the PS3 runs games off 256MB of memory, and here in the PS4 Sony throws that amount at just the background and standby-mode tasks.

assdan3834d ago

They did say that, but people are stupid and assumed it was wrong. Still cool, and I didn't know it had its own RAM.

NateCole3834d ago

Before anyone blows a gasket: it's Gb, NOT GB.

B-radical3834d ago

Honestly should've just titled it 256MB and not tried to fancy it up with Gb..... Joe Dirté

DoctorJones3834d ago

Why don't they just say 256MB?

Intentions3834d ago

Cause it sounds inferior. In the technology industry, people like smaller numbers. It's also a standard way of making/writing a report etc, kind of like measurements etc.

kewlkat0073834d ago

Why say high resolution when you can say retina..

DoctorJones3834d ago

Because "high resolution" is a descriptive term for resolution, whereas Retina is just a brand name for Apple's displays. Most people would consider a retina to be part of your eye, though.

Hardly a useful comparison in terms of what we are talking about.

iamnsuperman3834d ago (Edited 3834d ago )

Like DoctorJones said, Retina is a brand name, not really a descriptive term for resolution. It is merely a buzzword (which is trademarked), which is why few (if any) other companies use it. (They talk about pixel density, which in a way is another buzzword, or about resolution.)

kewlkat0073834d ago

You guys didn't get it..


AMD gaming revenue declined massively year-over-year, CFO says the demand is 'weak'

Poor Xbox sales have affected AMD's bottom line

Read Full Story >>
tweaktown.com
RonsonPL14d ago

Oh wow. How surprising! Nvidia overpriced their RTX cards by +100%, and AMD, instead of offering real competition, decided to join Nvidia in their greedy approach, while not having the mindshare Nvidia (sadly) does. The 7900 launch was a marketing disaster: all the reviews were done while the card was not worth the money at all. They lowered the price a bit later, but it was not enough, and it came too late, outside the "free marketing" window that comes with a new card generation's release. Then the geniuses at AMD axed the high-end SKUs with increased cache etc., cause "nobody will buy expensive cards to play games", while Nvidia laughed at them, selling their 2000€ 4090s.
Intel had all the mindshare among PC enthusiasts with their CPUs. All it took was a competitive product and a good price (the Ryzen 7000 series, especially the 7800X3D) and guess what? AMD regained market share in DIY PCs in no time! The same could've happened with Radeon 5000, Radeon 6000 and Radeon 7000.
But meh, why bother. Let's cancel high-end RDNA 4 and use the TSMC wafers for AI, and then let the clueless "analysts" write their articles about "gaming demand dwindling".

I'm sure the low-end, very overpriced and barely faster (if not slower) RDNA 4 will turn things around. It will have AI and RT! Two things nobody asked for, especially not gamers who'd like to use the PC for what's most exciting about PC gaming (VR, high-framerate gaming, hi-res gaming).
The 8000 series will be slow, overpriced and marketed on its much-improved RT/AI... and it will flop badly.
And no sane conclusions will be drawn at AMD from that. There will be just one, insane conclusion: gaming is not worth catering to, let's go into AI/RT instead, what could go wrong...

Crows9014d ago

What would you say would be the correct pricing for new cards?

Very insightful post!

RonsonPL13d ago

That's a complicated question. Depends on what you mean: the pricing at the release date, or the pricing planned ahead. They couldn't just suddenly end up in a situation where their existing stock of 6000 cards is unsellable, but if it was properly rolled out, prices should be where they were while the PC gaming industry was healthy. I recognize the arguments about inflation, higher power draw and PCB/BOM costs, more expensive wafers from TSMC etc., but still, PC gaming needs some sanity to exist and be healthy. The past few years were very unhealthy and dangerous to the whole of PC gaming. AMD should recognize that this market is very good for them, as they have an advantage in gaming software, while other markets, however attractive short term, may just be too difficult to compete in. AI is the modern-day gold rush, and Nvidia and Intel can easily out-spend AMD on R&D. Meanwhile, gaming is tricky for newcomers, and Nvidia doesn't seem to care that much about gaming anymore. So I would argue it's in AMD's interest to sell some Radeon SKUs even at zero profit, just to prevent PC gaming from collapsing.

Cards like the 6400 and 6500 should never have existed at their prices; this tier was traditionally "office only" and priced at $50 in the early 2000s. Then we have the Radeon 7600, which is not really a 6-tier card. Those were traditionally quite performant cards based on a wider-than-128-bit memory bus, and 8GB screams "low end". So I'd say the 7600 should have been available below $200 (+taxes etc.) as soon as possible, at least for some cheaper SKUs. For faster cards, the situation is bad for AMD, because people spending $400+ are usually fairly knowledgeable and demanding. While I personally don't see any value in upscalers and RT for $400-700 cards, the fact is that DLSS especially is a valuable feature for potential buyers. Therefore, even the 7800 and 7900 cards should be significantly cheaper than they currently are.

People knew what they were paying for when buying a Radeon 9700, 9800, X800, 4870 etc. They were getting a gaming experience truly unlike console or low-end PC gaming. By all means, let's have expensive AMD cards, even above $1000, but first AMD needs to show value. Make the product attractive. A PS5 can be bought for $400. If AMD offers just a slightly better upscaled image on a $400 GPU, or their $900 GPU cannot even push 3x the fps of a cheap console, the pricing acts like a cancer on PC gaming. And poor old PC gaming can only endure so much.

MrCrimson13d ago

I appreciate your rant, sir, but it has very little to do with GPUs. It's the fact that the PS5 and Xbox are at the end of their cycle, before a refresh.

RonsonPL13d ago

Yes, but also no. AMD let their PC GPU market share shrink by a lot (and, through the poor value of PC GPUs over the years, accidentally helped the whole market shrink in general), and while their console business may be important here, I'd still argue their GPU division's profits could have been much better if not for mismanagement.

bababooiy13d ago

This is something many have argued over the last few years when it comes to AMD. The days of them selling their cards at a slight discount while having a similar offering are over. It's not just a matter of poor drivers anymore; they are behind on everything.

RNTody13d ago (Edited 13d ago )

Great post. I went for an Nvidia RTX 3060 Ti, which was insane value for money when I look at the fidelity and frame rates I can push in most games, including new releases. Can't justify spending three times what my card cost at the time to get marginally better returns, or the big sell of "ray tracing", which is a nice-to-have feature but hardly essential given what it costs to maintain.

KwietStorm_BLM14d ago

Well, that's gonna happen when you don't really try. I want to support AMD so badly and give Nvidia some actual competition, but they don't seem very interested in challenging, by their own accord. I've been waiting for them to attack the GPU segment the same way they took over CPUs, but they just seem so content with handing Nvidia the market year after year, and it's happening again this year with their cancelled high-end card.

MrCrimson13d ago

I think you're going to see almost zero interest from AMD or Nvidia in the gaming GPU market. They are all in on AI.

RhinoGamer8814d ago

No Executive bonuses then...right?

enkiduxiv13d ago

What are you smoking? You've got to layoff your way to those bonuses. Fire 500 employees right before Christmas. That should get you there.

Tapani13d ago (Edited 13d ago )

Well, if your Gaming segment is down 48% in Q4, as theirs is, which in absolute money terms is north of 500M USD, then you are not likely to get your quarterly STI, though you may still be eligible for the annual STI. The LTI may be something you are still eligible for, such as RSUs or other equity and benefits, especially if it is based on the company's total result rather than your unit's. It all depends on your contract and AMD's reward system.

MrCrimson13d ago

Lisa Su took AMD from near-bankruptcy to one of the best semiconductor companies on the planet. AMD went from 2 dollars a share to 147. She can take whatever she wants.

Tapani13d ago

You are not wrong about what she did for AMD, and that is remarkable. However, MNCs' reward schemes do not work like "take whatever you want because you performed well in the past".

darksky14d ago

AMD priced their cards thinking they would sell out just like in the mining craze. I suspect reality hit home when they realized most gamers cannot afford to spend over $500 on a GPU.


Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin24d ago

Best for the money is the Arc cards

just_looken24d ago

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but ASUS/ASRock have some, and the next line, Battlemage, is coming out, prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own 2/4 card versions.


AMD FSR 3.1 Announced at GDC 2024, FSR 3 Available and Upcoming in 40 Games

Last September, we unleashed AMD FidelityFX™ Super Resolution 3 (FSR 3) on the gaming world, delivering massive FPS improvements in supported games.

Read Full Story >>
community.amd.com
Eonjay55d ago (Edited 55d ago )

So to put two and two together... FSR 3.1 is releasing later this year, and the launch game to support it is Ratchet & Clank: Rift Apart. In Sony's DevNet documentation, Ratchet & Clank: Rift Apart is the example shown for PSSR. The PS5 Pro also launches later this year... but there is something else coming too: AMD RDNA 4 cards (the very same technology that's in the Pro). So PSSR is either FSR 3.1, or it's a direct collaboration with AMD that builds on FSR 3.1. Somehow they are related. I think PSSR is FSR 3.1 with the bonus of AI... now let's see if RDNA 4 cards also include an AI block.

More details:
FSR 3.1 fixes Frame Generation.
If you have a 30-series RTX card, you can now use DLSS 3 with FSR Frame Generation (no 40 series required!).
It's available on all cards (we assume it will come to console).
Fixes temporal stability.

MrDead54d ago

I've been using a mod that allows DLSS frame gen on my 3080; it works on all RTX series. It'll be good not to rely on mods in the future.

darksky53d ago

The mods available are actually using FSR 3 frame gen, but with DLSS or FSR 2 upscaling.

Babadook753d ago (Edited 53d ago )

I think that the leaks about the 5 Pro would debunk the notion that the two (FSR 3.1 and PSSR) are the same technology. PSSR is a Sony technology.

MrDead54d ago (Edited 54d ago )

I wonder how much they've fixed the ghosting in dark areas, as Nvidia is leaving them in the dust on image quality. Still, it's good that they are improving in big leaps. I'll have to see, when the RTX 5000 series is released, who I go with... at the moment the RTX 5000s are sounding like monsters.

just_looken54d ago

Did you see the Dell leaks where they are trying to cool cards using over 1kW of power?

We are going to need 220V lines for next-gen PCs lol

MrDead54d ago

That's crazy! Sounds like heating my house won't be a problem next winter.

porkChop53d ago

As much as I hate supporting Nvidia, AMD just doesn't even try to compete. Their whole business model is to beat Nvidia purely on price. But I'd rather pay for better performance and better features. AMD also doesn't even try to innovate. They just follow Nvidia's lead and make their own version of whatever Nvidia is doing. But they're always 1 or 2 generations behind when it comes to those software/driver innovations, so Nvidia is always miles ahead in quality and performance.

MrDead53d ago

I do a lot of work in Photoshop, so an Intel/Nvidia setup has been the go-to because of the performance edge; more expensive, but far more stable too. Intel also has the edge over AMD processors with better load distribution across cores, fewer spikes and jitters. When you're working large format, you don't want lag or spikes while you're editing or drawing.

I do think AMD has improved massively though, and whilst I don't think they threaten Nvidia on the tech side, they do make very well-priced cards and processors for the power. I'm probably going with a 5080 or 5090, but AMD will get a little side look from me, which is a first in a long time... but like you said, they are a generation or two behind at the moment.

Goosejuice53d ago

While I can't argue for AMD GPUs (they aren't bad, but they aren't great either), AMD's CPUs have been great. I would argue the 7800X3D is one of the best CPUs for gaming right now. Idk about editing, so I'll take your word on that, but an AMD CPU is a great option for gaming these days.

porkChop52d ago

@Goosejuice

I have a 7800X3D. It certainly is great for gaming. Though for video editing, rendering, etc, I think Intel have the advantage from what I remember. I just mean from a GPU standpoint I can't support them.