Apple's marketing department is a BEAST!!! They'll never let a simple thing like the TRUTH get in the way of marketing their products!!!!
@theTechNotice
2 years ago
🤦
@kb420ps
2 years ago
@@theTechNotice lmao!!!
@ItumelengS
2 years ago
Truth is inconvenient at times
@Teluric2
2 years ago
It's true. Apple, along with many YouTubers, said the M1 was faster than an i9, and it was a lie
@tiromandal6399
2 years ago
@@Teluric2 You mean "iShills"?
@niceone1456
2 years ago
Apple engineer: the 64-core is a beast at video encoding, it outperforms the RTX 3090. BUT it's nowhere near Nvidia GPU performance in other fields. Apple marketing genius: ok, "outperforms Nvidia GPU"
@ja-kidnb6416
2 years ago
yeah JUST video encoding. I had to edit on an M1 Ultra and it basically was an annoyance compared to my 5950x with its 2060 Super. Never again... (and I loved my 2011 Macbook Pro!)
@chaos7360
2 years ago
Not even that, my coworker has an M1 Ultra and it's 4x slower than my PC in video rendering
@rebel2809
2 years ago
I feel like that's genuinely what happened. That sounds like such an Apple thing to do.
@fayenotfaye
2 years ago
@@ja-kidnb6416 You're probably using an application that either isn't optimised for AS or doesn't support a codec that the M1 Ultra is efficient at encoding
@destinyhero4795
2 years ago
Me with 1050ti and 2nd gen Ryzen, "hey, m1 ultra is not that bad :("
@attilaherbert6227
2 years ago
I think what most people missed is the axis description: "Relative performance". Basically, this graph means absolutely nothing, it's a fancy gimmick that makes your low-power computer look strong.
@MrFWStoner
2 years ago
Thank you! 🙏
@iffy_too4289
2 years ago
Exactly. I saw that and thought: WTF, relative to what? It can't be power, as that's the other axis, so what is it?
@noturbeezwaxbeaulac1383
2 years ago
no. it shows that the system does NOT consume as much electrical power. period.
@jovaraszigmantas
2 years ago
And even then it is still not up to par. At times it was drawing 4 times more power while scoring 5 times better in benchmarks, meaning that the relative performance is not always better on Apple. But you are definitely correct: comparing anything mobile to desktop grade will always be useless, as it is not a fair apples-to-apples comparison.
@jessepatterson8897
2 years ago
It's comparing performance at matched TDPs. No one remotely lied, but good job falling for clickbait
@sunraiii
2 years ago
Buy the product, not the marketing, my friends. Always good to double-check companies' claims. Great video!
@arjunsingh68112
2 years ago
Nvidia/Intel/AMD won't give you the flex Apple provides. My mobile 1050 Ti is still better than most M1 Macs out there, for a fraction of the price.
@goshu7009
2 years ago
@@arjunsingh68112 Every computer is better than a Mac. That's because a Mac is not a computer, right? At least that's what they say?
@xjet
2 years ago
The M1 uses one fifth the power (instantaneous measurement) but takes five times as long -- therefore the total energy per task is about the same. No significant win there either 😞
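The arithmetic behind this comment can be sketched in a few lines (the 450 W / 1 hour figures are illustrative assumptions, not measurements):

```python
def energy_per_task_wh(avg_power_w: float, duration_h: float) -> float:
    """Energy used for one job: average power (W) times time (h), in watt-hours."""
    return avg_power_w * duration_h

# Assumed figures: a 3090 system drawing 450 W finishes a job in 1 hour;
# the M1 draws one fifth the power but takes five times as long.
rtx_wh = energy_per_task_wh(450, 1.0)
m1_wh = energy_per_task_wh(450 / 5, 1.0 * 5)
print(rtx_wh, m1_wh)  # both 450.0 Wh: the same energy per task
```

Scaling power down by some factor while the job takes that same factor longer leaves energy per task unchanged, which is the commenter's point.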
@zadsazhad
2 years ago
Ikr
@VatchGaming
2 years ago
what about the entry level?
@zadsazhad
2 years ago
@@VatchGaming The M1 is good enough for everyday use with great battery life, but the optimization still isn't there and you can still get better performance for the price; the trade-offs are battery and audio
@Real_MisterSir
2 years ago
Except that with the M1 you will never have fast workflow, so it's a net loss for Apple in that regard. What you have is portability, that's about it. For any professional workload outside of specific media encoding that the M1 is optimized for, as a professional you are going to lose money by using the M1 simply because of the time you're wasting, and how the performance per watt doesn't scale up linearly (aka the more performance you require, the worse the power consumption for that performance gets). And then there's the price, which isn't in the M1's favor either. Nor is the fact that many professional programs for rendering allow for cpu+gpu rendering, so you aren't losing out by having a separate cpu and gpu in your system.
@Real_MisterSir
2 years ago
@@zadsazhad But even for everyday use and battery life, who spends 1k over the regular M1 which is already incredibly pricey..
@super_hero2
2 years ago
So basically the 3090 is 5x faster than the Ultra but draws 5x more power; that is pretty much linear scaling. Good job NVIDIA.
@JustRuda
2 years ago
Yes, but you can still optimize the voltage. I just changed the voltage on an RTX 3080 12G; I now have about 100W less power consumption, but the performance is literally almost the same (3-5% difference).
@tkpenalty
2 years ago
@@JustRuda Add to that the fact that Samsung 8nm isn't a great process compared to TSMC 5nm / N7
@whitedawn215
2 years ago
Power doesn't generally scale linearly; the same voltage through an M1 chip would not give it nearly the same performance.
@super_hero2
2 years ago
@@whitedawn215 It means that the Ultra still has a long way to go to catch up to NVIDIA. It can't even beat NVIDIA in efficiency in these tests: it uses 5x less power but is also 5x slower. That is terrible performance for the Ultra.
@juslenjeyatharan1004
2 years ago
@@super_hero2 Actually, that's on par in performance per watt. It's just that the Nvidia can go 5x faster.
@vlogsingh
2 years ago
You can't play games with an M1, but a 3090 can
@YouTube.Pigeon
2 years ago
Parjeet
@pirojfmifhghek566
2 years ago
If "efficiency" is all the M1 has going for it, has anyone tried matching the M1's performance with lower grade parts? The 3050 can smoke the thing and it only draws 130w. Put them side by side and see where that gets you. Is that 30-40w of power savings worth the extra three thousand dollars you spent on a device you can never DIY repair?
@noturbeezwaxbeaulac1383
2 years ago
Ya, the M1 is not a gaming video card; it's an all-in-one chip that is strong for its type. If you want to compare it to anything in the PC world, start looking at motherboards with integrated video, not plug-in video cards.
@pirojfmifhghek566
2 years ago
@@noturbeezwaxbeaulac1383 That's the space that the chip itself exists in, but that's not what the desktop computing space is. The M1 makes sense for cheap and light iPads and laptops, where space and power are at a premium, but if you have even a little leeway to add a dedicated GPU to the system it absolutely trounces Apple's all-in-one design. And that's not good for a system like the M1 Studio, which was designed AND MARKETED to compete directly with other desktop behemoths. Unfair comparison? That's the comparison Apple is trying to sell us on. And they fudged a lot of numbers to feed us that lie.
@gamechannel1271
2 years ago
@@noturbeezwaxbeaulac1383 Apple decided to start the discrete GPU comparison, not us.
@akshobhyamanthati2304
2 years ago
@@pirojfmifhghek566 He's talking about the M1 Studio, the one for pros, so performance matters more than power efficiency
@noturbeezwaxbeaulac1383
2 years ago
@@gamechannel1271 Apple has been comparing its products for a long time, and that's normal. And if you do want to compare things, make a point of keeping the comparison fair. Otherwise this was good clickbait.
@kewk
2 years ago
I was appalled at the real-world performance vs what they stated, so I returned mine shortly after I bought it. It essentially kept up with my RTX 2070 laptop, whereas my main PC smoked it in everything.
@kevinweber5129
2 years ago
It's essentially a laptop chip doubled. If they wanted power they would be pumping more power through it and running the chip faster. It's limited just like the trash-can Mac Pro; at least they made accommodations for a real GPU there. What I think is a big loss is that you can't buy it with one Max chip in it and then add a second one later.
@trevorlafave
2 years ago
@@kevinweber5129 For video editing, music production, and photo editing, this is still one of the better options. A lot of producers use Logic Pro, and many creators use Final Cut. The thing is, most of these use cases don't require that much GPU power. You can barely game using macOS, so you will probably not need the extra GPU power anyway. The CPU is what makes the M1 Ultra great.
@jakejakedowntwo6613
2 years ago
@@trevorlafave Makes sense; the M1 is marketed as a media machine and the 3090 is a gaming card
@kevinweber5129
2 years ago
@@trevorlafave I've read comments that Logic Pro plug-ins have had problems, so it doesn't work for some. I guess it's great if Final Cut is your one and only software.
@noturbeezwaxbeaulac1383
2 years ago
@@kevinweber5129 Jesus, it doesn't work like that. The two chips in the Ultra are fused together; you can't "just plug in" a second one. It's not like the old dual-socket systems. Good luck making those 0.5-nano solder joints
@Mark-yp9dl
2 years ago
Clearly Apple made the comparison of their M1 Ultra chip against the GTX 1650.
@jasonsykes4199
2 years ago
Don't give Apple credit. They clearly benchmarked it against an original Gateway computer.
@DigitalImageWorksVFX
2 years ago
In terms of desktop computers, I'm not interested in how much power they consume. I'm interested in how they perform. Apple is still far behind in terms of raw GPU performance, but the future may be interesting.
@simplyruben3184
2 years ago
Power consumption should matter, especially nowadays: energy bills are higher than ever, so having a GPU that can push fast refresh rates at the cost of drawing 900W (coughs in 4090)... yeah, no. I'd rather not pay $1000+ monthly for power. Efficiency matters: the more efficient a card is, the better its performance on top of reduced power bills. But I guess your family pays the bills, so you don't care
@llothar68
2 years ago
That's why we need a law for this. It's insane to have more than 1W in standby and more than 20W in sleep. Intel/ATX mainboards draw 20W in standby.
@flashback4588
2 years ago
@@simplyruben3184 Maybe for the average consumer, but I think these workstations are meant for big companies with millions of dollars, where performance is more important than their electricity bills
@spirit9087
2 years ago
@@simplyruben3184 Bruh, you're going to wait 5x longer for a 3D render on a Mac than on a PC and still pay about the same bill, and you also have to pay 5x the monitor's electricity on the Mac, if you're talking about power consumption
@GUY-on-Earth
2 years ago
True, but first they need 10 years of milking and then they'll start to mean what they say, by which time they'll still be behind on innovation
@iancurrie8844
2 years ago
Apple lies all the time. They claimed that the M1 ran Photoshop with 8GB of RAM better than PCs with 32GB, and a bunch of bought-and-paid-for YouTubers backed them up. They claimed that having unified memory allowed an 8GB system to do miracles and it's all you'd ever need. That's absolutely impossible. As a heavy Photoshop user, I can tell you that some projects can exceed 16GB in a single instance of the program. No amount of "unification" can make that work on 8GB of total system memory. Then when the higher-end M1 chips came out, those same YouTubers claimed they fixed the slow productivity of the M1, which apparently didn't exist until the higher ones launched. It's all a pack of lies. Look at Cinebench: they rank the lowly Core i5-12400 above the M1 Max.
@itsmeadmiral
2 years ago
Where is your evidence? Or is this just more speculation?
@iancurrie8844
2 years ago
@@itsmeadmiral It's all over YouTube. Just have a look. Watch the initial M1 reviews and now the Max ones. The narrative changes. It's all lies. As for Cinebench, those benchmarks are freely available.
@itsmeadmiral
2 years ago
@@iancurrie8844 got it.
@priyadarsihalder9242
2 years ago
Just blindly trust Apple and you will be faster than Superman himself. The thing you need most is blind trust in Apple 🙂.
@cheeeeezewizzz
2 years ago
Gotta say though, for us lighter Photoshop users, Affinity Photo on the iPad is soo much more approachable than Adobe is
@RealityStudioLLC888
2 years ago
My best guess is Apple did a video editing "benchmark" that decoded/encoded from/to ProRes and had no effects whatsoever. That way the M1 Ultra gets to use all its ProRes hardware decoders/encoders, whereas the Nvidia 3090 can't.
@llothar68
2 years ago
As far as I remember (and I've watched the Apple events since 2006, when they moved to Intel), Apple never compared GPU computing power. They always mean encoding.
@Teluric2
2 years ago
@@llothar68 It's not the first time Apple has used deceptive marketing and lies.
@PanosPitsi
2 years ago
@@Teluric2 Or any other company, for that matter. Remember how ray tracing would make video games look like real life, or how the PS5 would compete with $2k PCs? Or how Huawei would make an Android competitor? Or how Cyberpunk would be the game of the decade? Apple saying their GPU is good in some irrelevant test was to be expected; their GPUs always sucked, so this would be no different
@krellin
2 years ago
Well, to be fair, the fact that the CPU and GPU are integrated with Apple makes a huge difference in real-world workflow when both are involved in whatever you're doing. I agree that the graph is complete crap, but make no mistake, no one comes even close when it comes to the final product; I can't buy any other laptop because everything else is garbage in both the software/build and performance departments. I have a PC with a 3090 which I use for gaming, but the Apple MacBook Pro I use for everything else, plus more and more gaming on it as well. I prefer the PC for ergonomics and my monitors, but it is obsolete if I ignore those reasons and the price of the Mac. My future setup is just a MacBook Pro with Windows Boot Camp for gaming
@muthukumarannm398
2 years ago
@@krellin What you said doesn't make sense. I read it twice.
@No-mq5lw
2 years ago
Don't forget there's the 3060 Laptop, which takes up 80W on paper (out of a max 180W), with an OpenCL Geekbench 5 score of 100903, a CUDA score of 108249, and Blender scores of 1249.47 on monster, 738.999 on junk, and 641.888 on classroom. V-Ray CUDA is at 873. Sample size of 1, btw. Did I also mention that this system cost me around $900?
@No-mq5lw
2 years ago
@Tevin Gigabyte G5 KD. Happened to get it on sale though.
@TheLonelyMoon
2 years ago
I was thinking, hey you know, PCs with 3090 are probably more expensive, the marketing is off by a long shot but it doesn't sound half bad since it should be a cheaper opt--- "Starting price: from $1999" "M1 Ultra: from $3999" Aight imma head out
@randomanimegalaxy6859
2 years ago
It's very much comparable with a 3060, so you should compare the price of a 3060 with the M1, which is way cheaper. The 3090 is just overkill for the M1 Ultra, and so is its price.
@TheLonelyMoon
2 years ago
@@randomanimegalaxy6859 Yeah, I thought it was gonna be priced around 1500 to 2000 for ultra, not 4000 👁👄👁
@kiro253
2 years ago
Haha, do you really expect a cheap price from this company called Apple? It literally only sells premium garbage
@HoodedMushroom
2 years ago
@@randomanimegalaxy6859 Right now, looking at the prices where I'm from (EU), and it's in stock, I could buy a full build with a 3090 Ti and a 12900K with DDR5 for 4k dollars. Apple's pricing is just ridiculous.
@Nyaruko166
2 years ago
@@HoodedMushroom They sell a brand, not a product 🐸☕
@markdalbey
2 years ago
The graph says Apple uses 200 watts less than the RTX 3090. So? I live in Las Vegas. The cost difference in electricity per year, at 8 hours per day, 365 days a year, would be $70.95. In Hawaii, which has the highest electricity price in the industrialized world, you would save $221.45. While significant to me, I am not spending $4,000 on a computer, or $2,200 on a graphics card. That cost of electricity probably is not a factor if you are in the industrialized world. It might be a factor in the Solomon Islands, which has the highest electricity rate in the world; the difference would be $578.16. If you live anywhere other than the Solomon Islands, I don't see the electricity savings being a factor in your decision making. As for fan noise: my A/C is a lot louder than any computer fan. I just wear noise-canceling headphones. They make everything really quiet, and I get to listen to B.B. King as a bonus.
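The dollar figures in the comment above check out arithmetically; here is a quick sketch (the electricity rates are back-calculated from the comment's own numbers and are assumptions, not looked-up tariffs):

```python
def annual_cost_usd(extra_watts: float, hours_per_day: float,
                    usd_per_kwh: float, days: int = 365) -> float:
    """Yearly cost of an extra power draw at a given electricity rate."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days
    return kwh_per_year * usd_per_kwh

# 200 W extra draw at 8 h/day is 584 kWh/year; rates implied by the comment.
for place, rate in [("Las Vegas", 0.1215), ("Hawaii", 0.3792), ("Solomon Islands", 0.99)]:
    print(place, round(annual_cost_usd(200, 8, rate), 2))
```

At typical grid prices, a 200 W difference works out to well under a dollar a day, which is the commenter's point about it not driving a $4,000 purchase decision.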
@tjcarr70
2 years ago
What you should consider is where that power comes from. If it's a clean and efficient source, then no problem. If people thought more about the environment then this would be a major factor, especially if you come from a third-world country like the UK where people cannot afford to put food on the table….
@RunForPeace-hk1cu
2 years ago
You are talking about ONE Mac Studio. Corporations buy them for data centers and all around their offices... It's NOT about you. When Apple made that presentation, they were talking to corporate executives, not you 😂 There are companies buying racks full of Mac Minis... replace them with Mac Studios... BAM... money saved. You just don't get it because you are a simpleton.
@mikldude9376
2 years ago
@@tjcarr70 If you can't afford to put food on the table, then you won't be worrying about high-priced computers, will you?
@TrioLOLGamers
2 years ago
You have the same issue: a computer doesn't always consume that much wattage, only when working. If you do light tasks it will in some cases draw barely a watt. That's why you see some shoddy, laggy Windows/Linux laptops on Amazon that cost nothing but last longer than your PC with a 3080. The reason you buy a big power supply is not because it will drain all that wattage, but because there are peaks, and when they come they cause CPU/GPU throttling; if the supply is cheap it can even die... Believe me, it happens, and that can even damage your RAM or your SSD. That's why everyone said Apple made a big mistake with the M1 Ultra's power supply: it's too close and not sealed off from everything.
@timotmon
2 years ago
You nailed it, Lauri. Apple has been trying to appeal to 3D artists since the M1, when they made it big news that Cinema 4D was optimized for the new chip. So to come out with a graph that misleading is a downright dirty play, in my opinion. Nice work!
@Teluric2
2 years ago
You don't optimize for a chip; you write a native version for the chip. Same crap Apple does, trying to fool people as if everybody has the IQ of an Apple fanboy
@1.1kSubChallengeWithoutAnyVid
2 years ago
Where are the 2 replies? I want to read them.
@jmun3688
2 years ago
@@1.1kSubChallengeWithoutAnyVid fr
@timotmon
2 years ago
@Garrus Vakarian He doesn't do game benchmarks; that's an entirely different thing. His channel focuses exclusively on creative production. So yeah, the M1 Ultra can't compete with a 3090 in GPU-centric creative software on almost any level. It certainly can't compete in 3D rendering applications like Redshift and Octane, plus it's not scalable.
@timotmon
2 years ago
@@dannybcreative Danny, I love Macs, and small-scale 3D projects are perfectly fine on that platform. FCP is exceptionally optimized for the new M1 architecture, plus the low energy consumption is phenomenal. I have to say that most users would be very pleased and very impressed with the new Mac Studio. There is no reason you can't create the highest quality of visual content on the new Macs. The only exception would be if you were rendering very dense polygonal objects with high-resolution bitmaps for 3D work. That's fine, and PCs are exceptionally focused on performance like this, but not quite as elegant. I would love to drive an F1 car from time to time, but I'd rather have a high-end Mercedes as my daily driver.
@OrestisRovakis
2 years ago
For power consumption you should consider the time that power is drawn. If the RTX system needs 30 minutes at 450W, that's the same energy as a system taking 5 times longer to do the job at 5 times less power, which is close to what we saw here.
@lyte561
2 years ago
Doesn't mean you game less if you get more fps
@piotrj333
2 years ago
There are 2 issues. First, Nvidia already runs at the edge of efficiency; a stock RTX Ampere card is basically an overclocked Ampere. I have a 3070 Ti, and without any unstable tricks I simply dropped the TDP to 75%, yet performance was still 96%. I also tested all the way down to 40% TDP and found the 3070 Ti's best efficiency per watt at 55% TDP. And I know from Puget Systems that this is a common thing to do when building render farms. If you do more extreme tricks like undervolting, efficiency gets even better. Second, Ampere is an old 8nm process and is more than 2 years old. The RTX 40 series is coming soon and will be much closer by date of release to the M1 Ultra than the M1 Ultra is to the RTX 3090.
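The figures in the comment above imply a sizable efficiency gain from power-limiting; a small sketch using only those reported numbers (75% power limit, 96% performance retained):

```python
def relative_perf_per_watt(perf_fraction: float, power_fraction: float) -> float:
    """Performance per watt relative to stock settings (stock = 1.0)."""
    return perf_fraction / power_fraction

# Figures reported in the comment: a 75% power limit kept 96% of performance.
gain = relative_perf_per_watt(0.96, 0.75)
print(gain)  # 1.28 -> about 28% better performance per watt than stock
```

This is why render-farm builders power-limit GPUs: giving up a few percent of performance buys a much larger cut in power draw.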
@theyoutuber273
2 years ago
No one buys a top-end desktop for efficiency. I can understand efficiency on a laptop, where you have a battery, but otherwise it's nonsense. Also, efficiency isn't linear with power, meaning that even if you give the M1 Max the same wattage as the RTX 3090, it won't have anywhere near the performance of the RTX 3090.
2 years ago
and then there is the saying "time is money"
@Funcle
2 years ago
@@lyte561 nobody buys Apple for games
@nikhilpaleti3872
2 years ago
This needs to reach everyone. Apple literally has nothing but lies in their arsenal. Weaker than a 3050, but crying that they beat the 3090. Just like they lied that the M1 can beat an i9 while it struggles to beat a Ryzen 5. No one cares about a 10W power draw when you get no performance, need to give up compatibility with x86, and lose ALL upgradability for it
@Busy-B.
2 years ago
Not to mention, if your power consumption is 5x lower but you need 5x the time for the same tasks, it comes out to nothing.
@RiceCubeTech
2 years ago
The issue is video editors vastly outnumber other professionals, so it's a loud majority who act like the chips are bleeding edge, because for normal NLE work Final Cut and the M1 chips are extremely fast and efficient. But that's mostly due to the TWO video decoder blocks where most chips have one. It falsely leads "pros" in other fields to think it's somehow magically faster. But for any 3D rendering, you'd be better off with any Nvidia card.
@nikhilpaleti3872
2 years ago
@@RiceCubeTech Nah, it's just Mac fans that outshout everyone's arguments. Even for video editing, remove the advantage that specific Apple applications take and the numbers aren't as sky-high either. It's still much more on par with a 3060 or so than with a 3090. And of course, the point about everything outside video editing stands.
@darealduck6945
2 years ago
I don't have a Mac, but: that 10W power draw is essential to MacBook users, and I have seen videos of how users love how the M1 sips power in comparison to the power-hungry i9 MacBook Pros of yesteryear, and how the M1 outpaces the i9 laptops in video work and rendering
@Rave_Etherメ
2 years ago
Let's not forget about the fact that an RTX 3090 costs about $2,100 while the Apple is about $4,000; it would take a long time to recover this amount through the Apple's lower energy demand
@eneveasi
2 years ago
Apple seems to power throttle their machine rather than building actual thermal design into their product. Efficient performance at a low point in the relative benchmark curves. That ends up being useless for anyone with anything major to get done
@Byronic19134
2 years ago
I mean, Apple clearly has figured out something about thermal design, because their most popular unit at the moment is the MacBook Air, which doesn't even have a fan.
@julian5857
2 years ago
@@Byronic19134 they just put a very efficient but not very strong chip in there
@PanosPitsi
2 years ago
It's an ARM chip; it's basically an iPhone chip on steroids
@PanosPitsi
2 years ago
@@julian5857 The average user won't even be able to use its true potential. And consider that people who actually need to 3D render must use Nvidia exclusively anyway, because most 3D rendering apps are CUDA-exclusive
@m.l.9385
2 years ago
@@julian5857 Yeah, but the M1 in the Air can actually burn more watts, or in other words has a higher TDP than its former lousy 2-core Intel counterpart (30W instead of 13W), without having a fan anymore. So they did improve the thermal design for this model drastically. And even the new MacBook Pro 16" is thicker than its Intel counterpart. Apple choked all the Intel CPUs close to their thermal death. And yes, they did improve thermal design in this generation. It is actually the case that they would rather let the new M1 Ultra throttle than ramp up the fans, because they want it "absolutely" noiseless.
@emberparadox458
2 years ago
I sat in on an Apple presentation to retailers, and they oh so loved talking about how great the graphical performance was on the M1. However, when it came time for the Q&A I posed the question of how the M1 performs in comparison to Nvidia's RTX series, since they went on and on about how great it was for gaming. Their response? The M1 is fantastic for games in Apple Arcade... and that's all they said about it. They immediately pivoted away from my question and refused to answer any more of them. Between how they are assembling their devices and what the actual capabilities are, Apple is turning into the snake-oil salesman that is somehow managing to con millions upon millions of people into thinking their PC products are soooooo high quality and soooooooo much better than Windows PCs that perform better at half the cost.
@theTechNotice
2 years ago
Makes you think, doesn't it? 🤔
@racistpixel1017
2 years ago
Apple knows that with the M1 they dug themselves into a deep hole. AMD is steadily approaching the M1's efficiency and performance, Intel has plans too, and in terms of nm they have tons of room to grow, whereas Apple can only shrink transistors so much more. I mean, Apple is in deep trouble; it's just not obvious to the average consumer yet
@rubik1452
2 years ago
Seems like Apple wants to make people spend huge money on a processor that performs worse than a cheaper one. What an amazing company.
@matthewgamman4303
2 years ago
Yeah, because most of the people that buy their products would just go "oh, Apple says it's good, I'm gonna buy"
@rubik1452
2 years ago
@@matthewgamman4303 Well, I don't understand them, dude; maybe they just buy by the decision of the crowd
@justinlikesme19
2 years ago
@@matthewgamman4303 Because the Western countries have no problem getting an Apple product, especially in Japan and Korea
@ChibiNaruto
2 years ago
Isn't that what they have been doing all along? They make people pay more than the price should be, while there are other products from other brands that work like a charm but cost less. The price-performance ratio of Apple products was never good, in my opinion
@Teluric2
1 year ago
"The teenager company"
@RenormalizedAdvait
2 years ago
84.5 × AVG(489.13%, 580.29%, 510.95%) ≈ 445 W, i.e. performance per watt is almost the same across Nvidia and Apple. The only thing is that Nvidia uses an 8nm process while Apple uses the vastly superior 5nm process. This makes Nvidia's GPU architecture light years ahead of Apple's.
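The arithmetic in the comment above can be checked directly (the 84.5 W draw for the M1 Ultra and the three relative scores are taken from the comment itself, not independently measured):

```python
# 3090 scores relative to the M1 Ultra (M1 Ultra = 1.0), from the comment above.
speedups = [4.8913, 5.8029, 5.1095]
m1_power_w = 84.5  # assumed M1 Ultra GPU draw, per the comment

avg_speedup = sum(speedups) / len(speedups)
# Power the M1 Ultra would need, scaling linearly, to match the 3090:
equivalent_w = m1_power_w * avg_speedup
print(round(equivalent_w))  # ~445 W, in the ballpark of a 3090's ~450 W board power
```

Under that (idealized) linear-scaling assumption, the two chips land at roughly the same performance per watt, which is the commenter's point about the process-node gap.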
@maximseryakov2373
2 years ago
What? Why is it ahead?
@mariuspuiu9555
2 years ago
@@maximseryakov2373 because it achieves similar perf/W using a much less efficient process node.
@Knightfire66
2 years ago
Yes, and imagine Nvidia switches to 6nm or 4nm... it would be about 200% faster, so it would be 800-1000% faster overall... W/fps AND $/fps are way better. Apple should beat the RTX 3090's little brother first: the 3050 is way better than the M1, and still only $300.
@andrewmicro16123
2 years ago
Another thing to consider is that the M1 cannot simply scale up to match Nvidia's performance. There comes a point of diminishing returns, which points to Nvidia's architecture simply being much superior to Apple's.
@mariuspuiu9555
2 years ago
@@andrewmicro16123 It is, after all, just a mobile architecture retrofitted to work in a workstation.
@omarspost
2 years ago
A professional's hourly rate dwarfs the cost of the electricity used by the Mac. It's common sense to compare productivity against time, not against watts. This is not a laptop; performance per watt is only a priority on portable devices, due to battery capacity. As fantastic as Apple products are, they sometimes unnecessarily shoot themselves in the foot.
@maharaj8460
2 years ago
They're like "we consume less power and make 2dB less sound." Like, OK bud, no one cares about that.
@ThinLineMedia
2 years ago
Apple always lies; not a big deal
@Prime_Rabbit
2 years ago
Apple lying about their products' performance by an insanely large margin? I'm shocked. Shocked I say, SHOCKED!
@digit313
2 years ago
Nvidia's 8nm vs Apple's 5nm is not fair. Bring the 5nm RTX 40 series vs the Apple M1 in August 2022.
@gn1l262
2 years ago
@@digit313 Apple is literally the one claiming they're stronger
@Tigerex966
2 years ago
You just saved people getting into 3D thousands of dollars. A simple $1,599 RTX 3050 build will outperform an $8,000 64-core Mac Studio Ultra beast build in 3D. That's just amazing. It will also give it a good fight with Intel Quick Sync in many areas of video editing, and the 12900K CPU will beat it in single- and multi-core in Cinebench, which measures all the threads in multi-core.
@breechcomet9724
2 years ago
Wait, even a 3050 can destroy the Mac Studio 😮 So PCs still offer better performance for the price compared to a Mac, I guess? 🤔
@Tigerex966
2 years ago
@@breechcomet9724 Apple is the king of hype. That's a Mac Studio Ultra with 64 GPU cores and 64GB of RAM. A PC with a 12900K, an RTX 3050, a PCIe 4 SSD, and the same RAM in a case with good cooling will beat the Mac. It will lose some video editing where Apple's media engine encoders come into play (on M1-optimized video editing using those encoders it's a beast), but Intel will win a few with Quick Sync, and win most 3D and games. Software optimization is the key for both. And newer, twice-as-fast GPUs are not far away this fall, at $2,000 to $5,000
@llothar68
2 years ago
@@hollyc5417 People into 3D follow the advice of Linus Tech Tips 🤣
@super_hero2
2 years ago
Well you also have to talk about the package as a whole though. It is like 20x smaller than the PC tower.
@llothar68
2 years ago
@@super_hero2 And every office still has space for a 20x larger box. I get the idea of smallness for mobile things, but otherwise, when we have space, big is beautiful. And it's always so much easier to repair big stuff than to have the motor skills to fiddle with small screws (yeah, the screws that hold NVMe drives are insane).
@reinhardtwilhelm5415
2 years ago
This is nothing new. Apple has been gaslighting consumers for almost their entire existence as a company, and it should come as no surprise that they’d sell you an “RTX 3090 killer” that loses to the RTX 3060 across the board. I could make a PC that beats this thing for $1500. That’s a disgrace.
@Crowdrender
2 years ago
The benchmark you could try running would involve encoding/decoding 4K and 8K video. You mentioned yourself in the video that there are 'better' or 'more' encoders in the M1; this might be the area where Apple could claim better performance, but we'll await your benchmarks to say for sure!
@DC9V
2 years ago
3:30 Actually, it's 244% of the M1, meaning that it's (244% - 100%) = 144% faster.
@tanzeelsalfi
2 years ago
But why would you subtract it? From Apple, it's just "244% more powerful" than the M1 Ultra
@tanzeelsalfi
2 years ago
Then according to your logic the M1 Ultra is 0% powerful compared to the RTX 3090
@EHouseFreak
2 years ago
Yeah, I think so too... and for the RTX 3050, it's -28%, i.e. 28% slower. It's calculated wrong if the M1 is the 100% baseline
@Sasoon2006
2 years ago
Exactly
@maxmakesfilms69
2 years ago
When you consider "Discrete GPU", they're likely comparing to Intel's GPU on CPU offerings, not Nvidia's RTX range.
@Bonekinz
2 years ago
Discrete GPU means a GPU separate from the CPU, and since Intel's Arc isn't out yet, if they're comparing it to integrated graphics they should note that.
@freztino
2 years ago
Cool comparison, but it annoyed me a bit how you kept saying the percentage difference wrong for the 3090. I.e., when the 3090 scored 460% you said it was 460% faster (when that would be 360% faster), but when the 3060 scored 150%, you correctly said it was 50% faster. These early benchmark tests were also the reason I decided to do an Nvidia build instead of getting a Mac Studio to use for Blender, even though I've always been a Mac guy.
@ABDTalk1
2 years ago
That's true
@NCPhotography
2 years ago
Did you know walking is more energy efficient than driving a car? I guess I'm going to spend the next 6 hours walking 20 miles to work.
@theTechNotice
2 years ago
🤣
@jsullivan10
2 years ago
Lol 😂 , yeah apple 🍏 smh….And I’m an apple fan 👦
@Noobmaster._.69
2 years ago
Lol nice one
@Lazzerman42
2 years ago
So what Apple says without really saying it, is, at the SAME power consumption, the Apple is faster... but they word it in a way that we hear "twice the performance at half the power need".... tricky Apples as usual
@matthewjessey12
2 years ago
That graph was showing what happens when you force the 3090 to use 200 fewer watts. It was comparing efficiency, not overall performance.
@KnightHasen
2 years ago
Efficiency also needs to be weighed against operation time in a rendering workflow. The 3090 at full tilt using almost twice as much power but getting the job done in less than half the time means it's not only more efficient at completing a given job, but it's also freed up for another job before the M1 Ultra is half done.
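The energy arithmetic behind this comment can be sketched directly: energy per job is power draw times time-to-finish. A minimal illustration, with made-up numbers chosen only to match the "almost twice the power, less than half the time" scenario described above (not measured values from the video):

```python
def energy_per_job(watts: float, seconds: float) -> float:
    """Joules consumed to finish one render job: power x time."""
    return watts * seconds

# Hypothetical figures: the faster rig draws ~2x the power but
# finishes the same job in well under half the time.
slow_rig = energy_per_job(watts=100, seconds=1000)  # 100000 J
fast_rig = energy_per_job(watts=200, seconds=450)   #  90000 J

# Higher draw, yet less total energy per job - and the GPU is free sooner.
print(fast_rig < slow_rig)  # True
```

The point generalizes: a chip that draws less power but takes proportionally longer saves nothing per job.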
@matthewjessey12
2 years ago
@@KnightHasen This is very true, but they didn't necessarily lie; they only showed one side of the data. It happens constantly. The OP just kind of misunderstood the intent. They didn't outright lie; he just didn't set the parameters to properly test their claim, or really understand what they were trying to say.
@raak23
2 years ago
Yes, this. People are standing in line to bash Apple for what literally every other commercial company does: manipulate data so that it looks in your favor. They're not lying, just trying to sell their product. I'm not a big fan of this tactic, however, virtually EVERY company does this. Bottom line is: choose the hardware that does the job for YOU. I personally don't mind sacrificing some performance, in exchange for some user experience.
@SylvainDuford
2 years ago
Apple is a marketing company first and foremost.
@akhil_1210
2 years ago
Dear apple, Windows isn't Android? Don't try to mess with that
@starshipupdates5217
2 years ago
The M1 Ultra has two major problems that no one is talking about: a wattage cap and a data-transfer bottleneck. This even applies to the M1 Max. The M1 Ultra has a data lane that isn't fast enough to feed the GPU and CPU with data; this is the equivalent of giving an F1 car normal fuel. It will only perform at 40-50 percent usage. When the M1 Ultra is completing a render or doing a demanding task, you will see GPU usage under 50 percent, because the data can't get to the GPU fast enough.

This shows up in the wattage figures: the GPU's on-chip TDP is 120 watts, yet we can see the M1 Ultra using 60 watts in some cases, which means the chip isn't being utilised to its full potential. If we consider watts per performance we can estimate GPU utilisation: 60 watts would be equivalent to 50 percent usage of the GPU. Data blocking is also a problem here; sometimes the GPU will drop to idle or under 10 percent usage (12 watts), which hinders its ability to produce a good score in the tests given. Essentially it's like packaging items on a slow conveyor belt: sometimes you're sat there waiting for products to come through, just like the M1 Ultra is waiting for data to come through so it can be processed.

In this video you failed to provide any information about clock speed, GPU percentage and CPU percentage, which would help with analysis of the scores. We also need to consider that emulation of some of these tasks decreases performance by 30 percent on average, and in some cases over 50 percent if the translation encoders have been pushed past 100 percent. If the GPU is at 50 percent usage and the emulation is a further 50 percent bottleneck, the Rosetta 2 emulated benchmarks take a badly compounded hit. Apple is aware of this issue but obviously wouldn't publicly announce it.

The M2 should solve these problems not only with better performance per core but also by fixing the data channels so the GPU and CPU can be used at 100 percent, not 50 percent.
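The utilization heuristic in the comment above reduces to a one-liner: estimate GPU usage as observed power draw over the on-chip TDP. The 120 W TDP and 60 W / 12 W draws are the comment's claimed figures, not measurements, and real utilization depends on more than power draw:

```python
def estimated_utilization(observed_watts: float, tdp_watts: float) -> float:
    """Rough utilization as a fraction of TDP, capped at 1.0."""
    return min(observed_watts / tdp_watts, 1.0)

# The comment's claims: 60 W observed against a 120 W on-chip TDP,
# with occasional near-idle dips to ~12 W.
print(estimated_utilization(60, 120))   # 0.5 -> the claimed ~50% usage
print(estimated_utilization(12, 120))   # 0.1 -> the near-idle dips
```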
@PanosPitsi
2 years ago
Ehh, it's a first-gen product: problematic and expensive. Remember how bad DLSS was when it came out? I'm an Apple user, but I'd never pay more than 1k for a desktop that can't even play a video game that isn't League of Legends. Right now Apple's desktop chips are only good for laptops and for value (ironically, yes, it's true 😂)
@starshipupdates5217
2 years ago
@@PanosPitsi Haha, Apple's value for money is terrible. Even if by some miracle Apple fixes the bottleneck and gets all games running natively on macOS, we still face problems such as no ray tracing, no DLSS, and the requirement of spending a lot of money on Apple hardware. It's now been confirmed that Apple's M2 chip isn't really next-gen at all; the only difference between M1 and M2 will be higher clocks and more cores. It's not confirmed whether the data bottleneck will be fixed yet, so perhaps we could see a huge performance improvement even though we would still be on the same 5nm node.
@PanosPitsi
2 years ago
@@starshipupdates5217 I got a Mac mini for 650 euros on release and it hasn't lagged since, even though I've been running engineering software on it. I was a PC guy, but last year the prices were insane.
@PanosPitsi
2 years ago
@@starshipupdates5217 The point of Macs is to get a computer that is work-exclusive, and a console if you want games. I got a Mac mini and a PS5 and it came to around 1100 euros. At this price you can't really complain, especially during the pandemic.
@PanosPitsi
2 years ago
@@starshipupdates5217 Also, you are looking at this the wrong way: Macs are specifically designed not to be used for gaming. And it makes sense; look at those glass mice and tiny keyboards. They didn't even have high refresh rate until recently. There is more to a machine than the OS and the chip.
@superyu1337
2 years ago
You shouldn't compare CUDA and Metal; they are entirely different things. CUDA is Nvidia's GPU computing framework, while Metal is a graphics API for rendering made by Apple. Metal is basically Apple's (proprietary) Vulkan.
@dsrbby
2 years ago
He is comparing what Apple compared... Apple marketing is full of lies. 😵💫😵💫😵💫
@remkojacobs9309
2 years ago
Metal is probably the fastest way to do GPU computing on Apple, and CUDA is probably the fastest way to do GPU computing on Nvidia hardware. I'd say it's fair to compare them.
@superyu1337
2 years ago
@@remkojacobs9309 Metal and CUDA are fundamentally different, though. OpenCL would be the equivalent of CUDA.
@remkojacobs9309
2 years ago
I'm not familiar with Metal, but doesn't it include something like OpenCL? I assumed it did, since what is a serious GPU API these days without compute?
@superyu1337
2 years ago
@@remkojacobs9309 As far as I know, it can compile C functions to Metal Shading Language to run them on the GPU, but the major focus of Metal is on rasterization graphics for games, if I'm not wrong.
@hanswurstusbrachialus5213
2 years ago
Maybe you should have included the math to get a comparison of performance per watt.
@Bob1997654321
2 years ago
Hey, this is a nitpick, but you're overstating the performance of the Nvidia GPUs by quite a bit. For example at 6:03, with the Apple M1 as the 100% baseline, you state that the RTX 3090 is 489% faster than the M1 Ultra on the Monster render; the difference is 389%, not 489%. You do this a couple of times for other things. Just something to be mindful of. Good video though!
@WaspMedia3D
2 years ago
The craziest part is that there are Apple fanbois running around actually bragging about how the M1 can render faster than a 3090... because they prefer to believe what they are told instead of actually finding out for themselves. Video editing might be an area where the M1 Ultra does well, though.
@brian2590
2 years ago
M1s are great for low-power setups. I use a cheap M1 Air laptop to remote-connect to workstations with Nvidia GPUs. I can live minimally on solar energy with almost no monthly bills... the Nvidia-based workstations are on the company power bill... It's a win-win all around. Use them both lol
@JortBasement
2 years ago
Apple didn't lie... the chart literally says "relative performance" over power consumption (watts). The 3090 is 5 times faster than the M1, yes, but it also draws 5x the power. That's what the chart is displaying. Can it be misleading? Yes, if you don't pay attention. This just seems like someone who doesn't like Apple trying to slander them even though they used "relative performance"... It says "relative" for a reason.
@darreno1450
2 years ago
The graph seems purposely misleading. IMO, that's just as bad as if they straight up lied.
@jasonsykes4199
2 years ago
@@darreno1450 < - This guy knows his shit.
@Ferdam
2 years ago
No big news then: Apple's M1 chip is awesome when it powers mobile or small-form-factor devices. But obviously the M1 still can't match hardware that is fully designed to deliver the best performance possible, which in turn means huge power draw, higher temps, higher noise, and a physically bigger size.
@Real_MisterSir
2 years ago
Yup. Especially the noise level argument is so dumb. My calculator has better noise efficiency than a 3090! That's literally the argument..
@frostilver
2 years ago
@@Real_MisterSir SIR, you've got my W for you
@stookla4942
2 years ago
The point of the video was to show Apple was straight up lying, though judging by your points you have never actually owned a proper high-end PC.
@Ferdam
2 years ago
@@stookla4942 Never said that Apple's marketing is acceptable. I have owned a few high-end PCs since 2008. What's your point, exactly?
@Freestyle80
2 years ago
@@Ferdam that you are anal about noise
@pompomaddons
2 years ago
You have to use the base 3090 by Nvidia, not a partner card, making this invalid.
@kvxtr
2 years ago
Even if you did get the baseline one, the difference is still way off, 2-3x more than the M1, so it's not even worth the effort; the results are very similar despite the partner card.
@vilcsith
2 years ago
I don't have a Mac Studio, but I'd be lying if I said that presentation didn't make me at least a bit excited. (After all, competition is amazing for the consumer.) I'm so angry that they straight up lied; imagine how the people who believed that graph must feel stumbling upon this video. Desktop ARM chips have a bright future, so why not focus on the power draw instead of hyperfocusing on "WE'LL OUTPERFORM EVERYONE OUT OF THE BOX"? Why must desktop ARM debut with a blatant lie?
@BlueBaron3x7
2 years ago
Apple is amazing at selling junk for years for thousands more.
@winstonthompson6210
2 years ago
The CPU of the M1 Ultra is basically 2 times faster than the M1 Max. However, this is not true for its GPU; it is being bottlenecked by a hardware limitation that apparently Apple wasn't expecting, and the more GPU cores, the worse the bottleneck, resulting in less gain in performance.

The graph Apple showed was for performance per watt, not for performance. Apple showed what it could do at a certain wattage without mentioning that the other computer can draw even more power beyond that point, meaning it can deliver even higher performance. Sometimes only 25% of the M1 Ultra GPU's full capability is being used during testing. Apple's graph showed it reaching 105 watts, which with linear scaling like the other M1 chips have would let it draw 120 watts, but I haven't seen any other tester get it to use more than 80+ watts, and this video just showed me 90+ watts.

Supposedly with the 48-core M1 Ultra you can get up to 1.7 times faster GPU performance, but no M1 Ultra gives basically 2.0 times the performance unless it's the CPU. They say that by optimizing for the tile memory apps can get better performance, but that's much harder than just optimizing for Apple's ARM chips and will take a long time, assuming developers bother. So I wouldn't expect a perfect (hardware) fix until the M2 Ultra.
@eneveasi
2 years ago
The M2 will still fall short, if I'm honest. I feel like they intentionally power-throttle their machines to avoid overheating, because the thermal design of the Studio just doesn't have the same cooling capabilities as a standard decent build.
@winstonthompson6210
2 years ago
I assume you mean the M2 Ultra, because the M1 doesn't fall short, only the M1 Ultra. The fans in the M1 Ultra Mac Studio are currently overkill; they are considerably heavier than the ones in the M1 Max Mac Studio because Apple says the M1 Ultra requires a better cooling system. Yet they don't go past idle while the Ultra is at work, and the device remains cool. So right now the Ultra is not getting hot enough to come anywhere near bringing the thermal system to its knees. Even the M1 MacBooks with fans get hot after gaming for a while, and their fans can get pretty audible; there is no reason why the Mac Studio should be any different if it gets hot enough. So clearly thermals are not the problem.
@Teluric2
2 years ago
What's the HW limitation?
@winstonthompson6210
2 years ago
The hardware limitation is outlined in the KZitem channel Max Tech’s video entitled ‘Mac Studio Review: What Apple DOESN’T want you to know..’
@PanosPitsi
2 years ago
@@eneveasi It's an ARM chip; the base M1 doesn't even need fans.
@13dma1rz
2 years ago
Thanks for running the tests. I don't much care about the power consumption issue. I'm not running a mining operation. Apple always appeals to creatives but they really can't compete on the hardware side.
@melaniodanilosindayen9011
2 years ago
Just wait. Intel said the same thing.
@PaulStoffregen
2 years ago
For creatives, Apple does indeed compete on the hardware side. The Mac tested with 3D rendering benchmarks in this video isn't the machine for heavyweight 3D rendering; that's the expensive cheese grater, which can run 4 W6800X or 2 W6900X GPUs. Yeah, it costs $20K. Maybe you could somehow cram 4 RTX 3090s into a PC for less, but the consumer GeForce cards (even the 3090) have less memory and some limits imposed by Nvidia's drivers to force pros to buy their much more expensive Quadro cards.
@PvtAnonymous
2 years ago
@@PaulStoffregen then who is the Mac Studio for? That's the only question that needs to be asked. Is it the replacement for the 27" iMac? If so, then great. Although it gets pretty expensive pretty quickly. If it's supposed to be a machine for pro-users and creatives, I don't really see how. They market it as an absolute powerhouse ("Empower Station"), but it fails the most basic tasks compared to x86 hardware. Some people even returned the Studio because of performance issues, saying that they expected the Studio to perform better than the 16" MBP, which it didn't. So I am still confused about the Studio.
@PaulStoffregen
2 years ago
@@PvtAnonymous My guess is it's mostly meant for people who use Logic & Final Cut.
@Saeed89
2 years ago
RTX 3090 is an ABSOLUTE BEAST and has the definite power to annihilate M1, 2, 3 or whatever crap Apple has to offer.
@KaoukabiJaouad
2 years ago
Great review, on point. Putting in the RTX 3050 was smart; it kind of helps people settle down. The M1 chips are insane, but their native performance is at low wattage: on laptops that's revolutionary, on desktops it's pretty stupid. Even on your electricity bill, for professional use you won't see much difference between a top-spec M1 Ultra desktop and an RTX 3090 PC with a 12900K, just because 99 percent of the time you use it in low-performance mode; the Intel will pull 40W from the wall in that mode, the whole PC probably 150W max. It's when you use it at full performance that you see the big difference in power draw (300 to 400W on average), and it's not like a 3D artist is rendering all the time.
@Teluric2
2 years ago
Engineers run models on their desktops for days and weeks nonstop using Ansys and FEA.
@KaoukabiJaouad
2 years ago
@@Teluric2 That is something else; that's a company with people hired for that. If it's one guy, say a George Hotz kind of level of machine-learning models, even he is one of a kind, top 0.1 percent, and he doesn't run training models for a long time; when he's serious, he has his company's compute farm doing that kind of stuff... at this point you're just splitting hairs.
@DigitalJedi
2 years ago
@@Teluric2 In my experience those types of workloads are almost always given to a company render / compute farm. You don't want the engineer(s) to be unable to do much work for days while their desktops are doing a simulation, when a compute farm would do it within the day in most cases and let them get on to their next thing.
@murutattack
2 years ago
So basically they charged $1000 more because of the speech. 😹
@SafetyFooT
2 years ago
Any of my professional audio colleagues who have "upgraded" to an m1 machine have been dealing with compatibility nightmares.
@theTechNotice
2 years ago
Yep, I've heard a lot about that as well.
@BabuMosahi
2 years ago
Their marketing department is such a beast that they'll even sell expensive products in the name of privacy. But anyone who knows Apple's "ACTUAL" privacy policy knows they say they'll collect your data but make no profit from it, while still sharing it with their partners and their own third parties. Those third parties are even allowed to collect your data through a loophole (not worth discussing here), so that technically and lawfully Apple cannot get sued.
@fLaMePr0oF
2 years ago
Yes, in real-world performance scores a PC+GPU is WAY faster than an M1 Ultra system. However, if you divide the benchmark scores by the watts consumed by each system, the M1 Ultra gives slightly greater performance per watt in the Monster and Classroom renders and slightly lower in Junkshop (i.e. divide the M1 results by 84 and the PC results by 452). This is what Apple means in their graphs, as they are clearly charting performance vs power consumption. Yes, it's a ridiculous comparison which bears no relation to actual system performance; the results of such a calculation are always going to be very similar, since you're basically just calculating the power efficiency of the silicon and both are built on similar processes. And yes, Apple deserves to be called out for such hokey and misleading 'benchmarks', but they are not actually lying.
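The per-watt division described above is easy to reproduce. A minimal sketch: the 84 W and 452 W wall-draw figures come from this comment, while the two scores below are placeholders for illustration, not the video's actual results:

```python
M1_WATTS, PC_WATTS = 84, 452  # whole-system wall draw, per the comment

def score_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by system power draw."""
    return score / watts

# Placeholder scores: the PC ~5x faster while drawing ~5.4x the power.
m1_score, pc_score = 100, 480
print(round(score_per_watt(m1_score, M1_WATTS), 2))  # 1.19
print(round(score_per_watt(pc_score, PC_WATTS), 2))  # 1.06
```

As the comment notes, the two numbers land close together, which is why a per-watt chart flatters whichever chip sits on the efficient part of its curve.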
@einz600
2 years ago
No, sorry. They literally lied: "M1 delivers faster performance than the highest end GPU available while using 200 W less power."
@Bollalillo
2 years ago
I've never understood why people buy anything associated with Apple.
@1235-g7r
2 years ago
apple is a total clown 😂 go back to making phones
@phoenixyt124
2 years ago
I love how the 3050 is faster than the m1 ultra
@peterfuentes5893
2 years ago
As a professional video editor, I’ll never understand why people in my field of work insist on getting overpriced apple products.
@powerhouse884
2 years ago
Cuz Windows sucks and is more problematic to work with. I have a PC with a 3080 and I still find macOS better to work with.
@peterfuentes5893
2 years ago
@@powerhouse884 I haven't had any problems with Windows. I will say that Mac OS is a bit easier and more intuitive to use but it's not worth the premium price tag in my opinion.
@samgao
2 years ago
I have a 5950x and a FE 3090 for gaming, and an M1 ipad pro for video editing. Best combo. Both cost over 2k. (PC cost over 6K)
@TheAacharge
2 years ago
ARM can help the M1 beat x86, but it can't help on the GPU side, since GPUs don't use x86. Basically, the total number of GPU transistors determines most of a GPU's performance.
@UhOhUmm
2 years ago
But it can't even beat x86. Sure, it's fast at a low TDP, but x86 pushes performance way above the M1. ARM is good for laptops; that's about it.
@FujinBlackheart
2 years ago
Apple made a really good processor there in terms of what it can do with less, but when it comes to raw power, and even price, you lose out, and even more so if you ever dream of upgrading your machine. Also, let's not talk about their dodgy customer service.
@theTechNotice
2 years ago
Here's my testing on the performance/watt, let's see who wins in that category? 👉 kzitem.info/news/bejne/y3mpmGuZh6xnkqA 👈
@Tigerex966
2 years ago
But it's not, except per watt.
@audiovid.
2 years ago
Man, you are manipulating the information, and when caught on it, instead of saying "sorry guys", you say "I am also not an independent tester". Unsubbed... I do not like Apple's products, but even more I dislike dishonest guys.
@blueben1224
2 years ago
Extrapolation goes with caution, risk, and disaster.
@andrewmicro16123
2 years ago
Performance can't just be extrapolated like that. Think about it: in a desktop machine, if Apple could have simply upped the power to beat the 3090, they would have done so. Power and performance don't scale linearly. Plus, the 3090 runs on a worse process than the M1. All these factors point to the Nvidia architecture being much superior in terms of efficiency and scalability.
@jakejoyride
2 years ago
Where is fps comparison in Cyberpunk?
@MARKXHWANG
A year ago
By messing with Nvidia, Apple lost gaming. Now they will lose any AI opportunity...
@ChibiTheEdgehog
2 years ago
I mean come on, don't tell me you have ever actually bought into any apple marketing. They are the kings of alternative facts
@skildfrix
2 years ago
Apple didn't know that their charts were placed upside down
@daves.software
2 years ago
FYI, 489% as fast is not the same as 489% faster. If the scores were identical, they would both be 100%, and you wouldn't say that one was 100% faster than the other. So if you're going to say "x faster" you have to subtract 100%.
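The distinction in this comment reduces to subtracting the 100% baseline. A trivial sketch:

```python
def percent_faster(relative_score_pct: float) -> float:
    """Convert '% as fast' (baseline = 100) into '% faster'."""
    return relative_score_pct - 100

print(percent_faster(489))  # 389 -> 489% as fast is 389% faster
print(percent_faster(100))  # 0   -> identical scores: 0% faster
print(percent_faster(244))  # 144 -> matches the 244%/144% point above
```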
@HiVikas
2 years ago
People who are willing to spend the money for the Apple M1 Ultra will not even think about power consumption; they are only concerned with performance.
@MichaeltheORIGINAL1
2 years ago
Wait, so you are telling me that Apple is misleading their customers? NO WAY!
@kvxtr
2 years ago
Didn't see that coming? Can't wait to see what graph Apple comes up with next time.
@Manu-xb4rl
2 years ago
So basically this exact M1 costs 5750€ with the lowest specs and is still nowhere near the RTX 3090, whilst they advertise it like that. WOW.
@svtcontour
2 years ago
Also, not to mention: if something uses far lower wattage but also takes far longer to complete the job, has it really consumed that much less power? Probably not that much of a saving any more.
@infinite_ender638
2 years ago
I like how even though it had 1/5 the performance of the 3090, it drew more than 1/5 of the power the 3090 did.
@schrodingerscat1863
2 years ago
It's worse than that, because the desktop machine has a desktop CPU also drawing a ton of power. The M1 is essentially an SoC, kind of like a souped-up iPad, so looking at the power drawn from the wall isn't really telling the whole story. The thing that is impressive about the M1 is the CPU side; that is screaming fast considering the power consumption.
@MrPablosek
2 years ago
And let's not forget that the 3090 can be undervolted very well. I'm talking about 30% less power consumption while losing only a mere 5% of performance. It's insane, and the 3090 wins even in terms of power consumption if you compare it like that to the M1.
@VapidSlug
2 years ago
@@MrPablosek Apple people have no idea what you are talking about
@MrPablosek
2 years ago
@@VapidSlug True
@MrCarsong123
2 years ago
Quick note for clarity: it would be 150% faster, not 250% faster/better.
@jimmyvandoran7200
2 years ago
I'm guessing what they really wanted to say was "if you compared both at the same power draw, this would be the performance". But that's apples to oranges, because Apple's chip isn't designed to run at that much power. They probably did some calculations to get estimates. I hate Apple.
@Ricky123abc
2 years ago
Bro, GPU 2 performing at 105% of the performance of GPU 1 means GPU 2 is 5% faster, not 105% faster.
@vibonacci
2 years ago
Imagine being the executive who forced Apple into a single-chip design, right before Intel's 13th gen and Nvidia's 4000-series releases...
@wric01
2 years ago
Add to that, Crapple is closing storage/repair options; you are stuck with it, or pay for network storage.
@junilog
2 years ago
They didn't specifically say RTX 3090, just the vaguely worded "highest-end discrete GPU". Misleading marketing is always how big corporations do their presentations. But these aren't exactly the perfect software to test with either: V-Ray is better done on the CPU, and 3DMark should have been in place of Geekbench. Blender is right, though the same can be said about running it on the CPU instead of the GPU, like V-Ray.

Oh, and noise + temperature is the biggest factor when using dozens of these systems in a single office. You'd 100% want less noise and especially lower temperature; the air conditioning needed to cool down the room is a much bigger concern than the power draw. Air conditioners working against 100 Mac Studios are nowhere close to 100 3090 systems. I don't know why this wasn't mentioned more.
@digitalworld1384
2 years ago
Keep up the good work, that 8K looks really nice.
@Tamago.
2 years ago
Yes, they lied about less power with better performance. But if you're talking about power-to-performance, they are about the same.
@leevfx
A year ago
A little late to this, but I think you're misunderstanding the graph. It's saying that IF you were to match the performance of these two machines, the M1 would be consuming 200W less power. The constant here is the performance: it's essentially underclocking the 3090 relative to the M1 and then comparing power consumption. It's a misleading graph because it makes you think it's a more capable machine when it's really just more efficient, and as you stated, it's not as if you can just inject more watts of power to gain performance.
@lol-di3tf
A year ago
Well, at the beginning the performance was NOT optimized, because a lot of software didn't really support the M1 Ultra. But now it performs just like an RTX 3090. The only problem is that 3D software vendors are still not interested in supporting Apple Silicon.
@cemsengul16
2 years ago
Only gullible Apple cult members believed the marketing claim that it was stronger than a PC with a 3090. I straight up laughed when I heard the lie during the keynote and said in your dreams pal.
@dbkgravity
2 years ago
In German there is a saying about comparing "apples with pears".
@dustinkrejci6142
2 years ago
This video: Gunfire on the battlefield..... ME: anyone hear boss music? Also me: bro Why are you dancing johnny? And why are you chief are you suddenly working out?
@Mu7eD-Stream
2 years ago
I have a laptop with an i7-11800H and a 3060 Ti. It features a 200W power brick; the GPU can pull a max of 115W and the CPU 70W. So for performance/watt against Apple, mobile platforms would be more relevant. Based on the scores here, this system will at all times outperform the M1 in terms of score, and otherwise you compare apples and oranges (sorry, couldn't help myself).

PC and SoC systems are very different in their internal architecture. A few examples: PCs have interchangeable parts, while Apples and laptops generally don't; Apple devices frequently use an external power supply, unlike desktop computers (this can be beneficial for multiple reasons, the main one being that the PSU doesn't add extra heat); and Apple uses an almost complete SoC (system on a chip: GPU, CPU, hardware codecs, internal bus control, cache memory and more). Only the RAM (plans have already been made to include this), sound (for quality an external processing chip is required, though it does have one embedded in the HDMI function of the display logic, so it could survive without), networking (with ARM advances it's just a matter of time until this joins) and some ports (these will use PCIe lanes) are really external to the main chip.

I would love to see how it compared to a top-end Lenovo or Razer laptop on both efficiency and price, with a more uniform approach for testing purposes. An SFF (small form factor) chassis like the Mac has here is not comparable to an M-ATX case, as the thermal designs are very different; using a MacBook Pro with M1 Pro would be best. Apple is currently far ahead with embedded video technology for low-power SoCs thanks to the iPhone, with ARM and AMD following closely behind thanks to Android combined with mobile and console gaming platform advances. Intel is still in the process of miniaturizing everything and developing its SoC environment. Intel has tried to enter the mobile platform multiple times before and failed, so it has been more cautious during the transitional period than other manufacturers. Kind of like when Samsung was LCD/LED and LG had OLED: Samsung too had been burnt by trying to bring the tech too early. In many respects, by keeping the majority of its devices small and low-power, Apple has actually already joined the next generation of computing devices, and being benchmarked against today's power-hungry hardware is pointless and really just gives Apple bragging rights.

Now, I do not believe a 3090 system with an i9 running flat out is only using 400-500W; I would put this estimate closer to 900W. The PSU will have 'saved' power in its capacitors, which will smooth out a lot of that demand when only running short benchmarks. We also have to factor in that most people have at best an 80%-efficient power supply, which in this instance matters, as the Mac uses less power than the PSU on the 3090 system wastes converting from 110/220V to 12V, 5V and 3.3V.

Now for the finances: to run a 900W PC 24 hours a day at 100% load would cost £1557/year based on a unit cost of 20p/kWh; to run a Mac at 95W 24 hours a day at 100% load would cost £166/year at the same unit cost. In just 2 years the money saved by using the Mac over the PC would actually buy you the PC, which also offsets the Mac's higher purchase price, since the power savings justify the higher tag.

I am actually a Windows guy; I hate the Apple software environment, and as an advanced user I find it highly restrictive. We must, however, give credit for advances where they are due and not misunderstand the intention of the marketing. What Apple wanted to highlight is that this is the most powerful SoC chip ever produced, and that it sits in the same league as systems with discrete (very important definition) hardware for both functions. They were trying to tell their users this is the future of computing, not "we are faster than anything currently available". Like any producer, it chose the evidence which highlighted its product in the best possible light. Whilst it may be a little misleading, the small print was not: we knew the environment, the software used and how the results were gained.
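The running-cost arithmetic above can be checked with a short sketch. It assumes 24-hour operation at full load and 20p/kWh; under those assumptions the result lands exactly on the quoted £166 and within about 1% of the quoted £1557 (the 900 W and 95 W wattages are the comment's own estimates, not measurements):

```python
def annual_cost_gbp(watts: float, hours_per_day: float = 24,
                    pence_per_kwh: float = 20) -> float:
    """Yearly electricity cost in GBP for a constant load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * pence_per_kwh / 100

print(round(annual_cost_gbp(900)))  # 1577 -> close to the quoted £1557
print(round(annual_cost_gbp(95)))   # 166  -> matches the quoted £166
```

Note that at 8 hours a day the figures would be a third of these, so the quoted numbers only work out for round-the-clock operation.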
@TheSnazzed
2 years ago
The trick is the vertical axis. The vertical axis isn't "Cinebench Performance" or even "Performance"... it's "Relative Performance". Relative to what?

This graph may be accurate in some very specific situation... Maybe the M1 is better than the 3090, per watt, when compared to... something. Dollars spent? Airflow maybe? Perhaps the Mac Studio needs lower airflow to stay cool, so performance relative to airflow is far better on the M1? Who knows. It could be anything. What was the reference, or baseline? They both get better than 100... so what was the baseline system that sets the standard? Performance relative to airflow, per watt... vs a Nintendo Switch? Who knows. Yes, my example is ridiculous, but I made it ridiculous to make a point.

The natural assumption is that "Relative Performance" means relative to each other: the M1 performance in relation to the 3090 performance... but it doesn't HAVE to be. In fact it probably isn't, because the graph IS the relationship. "Relative" doesn't need to be said... unless it's relative to something else. If I'm charting the acceleration of a car and the vertical Y axis is speed and the horizontal X axis is time, I label the Y axis "speed", or MPH or KPH. I DON'T label it "Relative Speed", unless it's the speed relative to something else!

I'm 98% sure Apple isn't lying; I'm 98% sure they don't want to be brought up on false-advertising charges. It's some arbitrary measure of performance against an unknown reference. I'm 98% sure they have something they can point to and say, "no, look, it IS true... from a certain point of view."
@FuSiionCraft
2 years ago
Hmmmm... Did you ever see Apple NOT lying?
@harishankarsingh1779
2 years ago
Apple was using a BS benchmark, the benchmark most trusted by Apple. 🤫🤣
@MegaKauti
2 years ago
I'm pretty sure I saw a comparison between a calculator and a supercomputer 😂😂😂
@ayushpawar8952
2 years ago
So who is gonna sue Apple for false advertising of products?
@xora2065
2 years ago
Some Apple fanboy in tears: It's freaking apple, they can't lie
@anupew3276
2 years ago
The most interesting part is that Apple is actually not even more power-efficient in Blender and V-Ray... it takes 5 times less power for 5 times less performance, so the same performance per watt as Nvidia, which has long been notorious for a performance-over-efficiency approach, especially at the high end, and Intel is basically the same. I expect it would be even worse for Apple compared to an AMD system, as AMD delivers high power with much better efficiency than both Nvidia in GPUs and Intel in CPUs.
@Agent_48_TN
2 years ago
Whatever Apple makes, I'm still a Windows & Nvidia user, no matter what.
@wingxerox
2 years ago
This is like the power-to-weight ratio of a car, but it's a power-to-computing ratio. Still scummy to market it this way, though.
@hyperlaps3
2 years ago
Having a custom-built PC: you can upgrade any part you like within your budget. Apple: you can buy an additional 16 GB of RAM for $1,999 HAHA!
@ΧρήστοςΓάλλος-ω2ψ
2 years ago
Mate, the Apple graph is not a lie; it's just that the 3090 can go all the way up to 400W while the graph only shows up to 300W, which is less performance than real-life scenarios!!!
@cis9222
2 years ago
The graph is not a lie and you know that; Apple was just showing the graph up to the TDP of the M1 chip. If you run the Windows hardware at that maximum power output... what will the performance be?
Comments: 2K