I hope you enjoyed this video. It's a favorite of mine. Also. Yeah, I know I have over 100K subs now. I recorded this video 2 months ago, back when it seemed so far away. Check out other technology reviews here: kzitem.info/door/PLKtxx9TnH76RiptUQ22iDGxNewdxjI6Xh
@masternobody1896
2 years ago
I wish I were a genius so I could make chips faster
@masternobody1896
2 years ago
RIP gaming, FPS stuck at 240fps
@Crunch_dGH
2 years ago
Very helpful, thank you very much!
@Shackkobe
2 years ago
Excellent content. But... I'm pretty sure you have already achieved this: kzitem.info/news/bejne/06qXx4OagKd-dpgm54s
@gazzy01
2 years ago
Thanks for the great content, it feels nostalgic. I studied all of this in my Electronics and Communications Engineering (ECE) bachelor's degree, and I still remember EDA, Virtuoso, Verilog, VHDL, chip design, and embedded systems. In India there are fewer opportunities for graduates; most companies (Nvidia, Synopsys, Intel, Siemens, Qualcomm, and STMicroelectronics) require a postgraduate degree (master's). Because of my financial situation, I studied Computer Science during my final year and changed my whole stream to get a job at a product-based company. My batch had 60 students, and 40 of them joined a CS/IT product-based company during campus placements. But I don't regret my decision; competition is high in both fields, and it pushes me to improve. I hope one day I can work at the intersection of both, or pursue higher studies to become a design/verification engineer.
@ulwen
2 years ago
My father ran a product development consulting company for 30 years. He said that, depending on the size/maturity of the customer, the project budget would go from 80% design / 20% testing for small/new customers to 20% design / 80% testing for large/mature customers. Seems like chip design is no different.
@TheVlad33
2 years ago
When chips get bigger it’s almost always 80% verification. Too much money at stake
@meatybtz
2 years ago
Reusable "code" was a revolution in programming that led to incredible growth over time. The same held true with hardware and programmable logic arrays. As someone who "did it from scratch" every time, the advent of FPGAs was a big deal. Our mainboards could do new things, but there was a development overhead to reinventing the wheel with them. Being able to buy logic was a time saver. Same thing with the development of automated trace routing: we went from our king artist, who could actually lay out traces manually in the program and optimize them in his head, to being able to click a button and populate a board. I had not thought about any of that in almost 25 years now. Sheesh.
@yum33333
2 years ago
"Automatic trace routing"... good lord, does anyone actually use this stuff? Even the high-end tools are just horrendous.
@monad_tcp
2 years ago
You need compilers; not only reuse, but automation. Of course, VHDL is basically the same idea we programmers had with Fortran in the 1950s. The problem is that VHDL is horribly dated; we have made much progress with formal proofs in programming languages, and hardware needs to catch up. Actually, my PhD thesis in computing science will be about compiler hardware synthesis, but totally automated.
@monad_tcp
2 years ago
@@yum33333 yes, when you have a billion transistors, you use it.
@monad_tcp
2 years ago
@@yum33333 Also, what's "horrendous"? If it's ugly but works, who cares? Does it respect the ERC? Then that's enough. Assembly programmers used to say the same thing about machine-generated machine code...
@monad_tcp
2 years ago
I know routing could be better, but it's usually a PSPACE problem in computing; it's really hard to solve very efficiently. But it doesn't matter much: you usually aren't constrained by area in silicon, not the same way you are on a PCB. So that's not a problem for in-chip routing. Imagine routing a 10cm PCB, but now with 1m of space. There's plenty of room on die.
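To illustrate the core primitive those routing tools automate, here is a toy Lee-style maze router in Python (my own sketch, not any production EDA algorithm): a BFS that finds one shortest rectilinear path on a grid with blocked cells. The hardness the comment alludes to comes from doing this for millions of nets at once under congestion, layer, and timing constraints.

```python
from collections import deque

def lee_route(grid, start, goal):
    """Minimal Lee maze router: BFS over a grid where 1 = blocked cell.
    Returns one shortest list of (row, col) cells from start to goal,
    or None if no route exists. Toy illustration only."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set + back-pointers
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:          # walk back-pointers to rebuild the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None
```

A single net like this is easy; the combinatorial pain starts when nets compete for the same cells and ripping up and re-routing becomes necessary.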
@kensgold
2 years ago
Keep in mind that in the post-silicon verification process, we have to test the silicon across the manufacturing process, temperature, and operating voltage (or, as we call it, PVT) we support. The production wafers we get back from the fab have some variance in the parameters used to make them, so during the verification process we get "split" silicon, i.e. silicon made using the extreme edge cases of those parameters, to ensure that the chip functions even at those extremes. Here's a good example of how this gets out of hand fast. Imagine your company makes a Bluetooth transmitter chip, and you want to make sure you meet the Bluetooth FCC standards. You have to test a minimum of 3 pieces of silicon from each process corner. You have to test your minimum voltage, nominal voltage, and max voltage, all across your temperature range (let's use -40C, -20C, 25C, 60C, 80C), all to test your spectrum emissions across all 40 broadcast channels. So in summary: 3 chips, 4 corners, 3 voltages, 5 temperatures, 40 channels, for a total of 7200 test cases. And that is for a single functional test. Shit gets wild.
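The multiplication in that example can be sketched directly; a quick Python sanity check of the numbers above (the corner names and values are illustrative, not any real product's test plan):

```python
from itertools import product

# Parameters from the comment above (illustrative, not a real test plan)
chips_per_corner = 3
corners = ["SS", "FF", "SF", "FS"]     # typical process-corner names (assumed)
voltages = ["Vmin", "Vnom", "Vmax"]
temperatures_c = [-40, -20, 25, 60, 80]
channels = range(40)                   # Bluetooth broadcast channels

# Every combination of corner x voltage x temperature x channel
test_cases = list(product(corners, voltages, temperatures_c, channels))
total_runs = chips_per_corner * len(test_cases)
print(total_runs)  # 3 * 4 * 3 * 5 * 40 = 7200 runs for ONE functional test
```

Each extra test dimension multiplies, not adds, which is why the matrix "gets out of hand fast."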
@waynemorellini2110
11 months ago
I'm curious: how difficult is it to set up an automated test for those? And when you talk about the manufacturers cutting into the parameters, do they cut the safety margin to the bone now? I knew an old chip designer who got 10x advantages back in the day by seeing how close he could cut into the safety margin on hand-designed chips and tiling software.
@kensgold
11 months ago
@@waynemorellini2110 The parameters I was referring to are the actual transistors' performance metrics: how much charge needs to build up before they flip, how fast that charge can build up, and, once flipped, how fast the charge drains. All of those characteristics vary within a window on a normal, non-process-controlled production wafer. As for automation, you basically have to automate everything. We have entire engineering teams dedicated to building our test infrastructure and test cases, not to mention running the testing.
@waynemorellini2110
11 months ago
@@kensgold Thanks. I was wondering how much time goes into preparing the test code and rig?
@kensgold
11 months ago
@@waynemorellini2110 It very much depends on the company and their validation strategy. Generally, after tape-out and RTL freeze, there is pre-production. This is where we simulate the chip on FPGAs so we can port our functional tests and system tests to the new chip. During that time we design and manufacture custom PCBs that we use for validation testing. That can take anywhere from 2-3 months, depending on how much changes between the old chip and the new chip. That time estimate is very chip-dependent, however: a generational change will take longer than a simple rev A to rev B of the same chip.
@scottfranco1962
2 years ago
Good overview. I left verification just after the Verilog revolution came to pass. We used hardware simulators at that point, with accelerated simulation; sea-of-FPGA emulation was not yet popular. The bottom line was that the company I worked for (nameless, but huge, trust me) was new to hardware chip design. They had lots of money to buy toys, but believed the tool would solve everything. Thus the expectation was that the new, more expensive tool would relieve them of the need to write test cases. I enjoyed being the one employee who understood the tools, but rapidly realized that I had volunteered to sit under a falling rock. I left the company soon after.
@prioris55555
2 years ago
I worked for a company (nameless, but huge, and it begins with the letter M) that verified hamburgers... :)
@spikester
2 years ago
@@prioris55555 Nameless but huge: when a company moves so much of the product that makes everything work that nobody really knows it by brand name. Unacceptable in today's shareholder farms. They could have chosen a better name like Corsair as a flanker brand, since they failed to move retail products Ballistixally. LOL
@gamar1226
2 years ago
@@prioris55555 Mellanox?
@ttb1513
1 year ago
After the Verilog revolution? That was a long time ago. I got into chip design after helping to create that Verilog revolution. It is truly incredible how massively the transistor budgets, or design sizes, have grown. This video does a good job, but it doesn't convey just how incredibly complex the corner cases are for chip design compared to SW. A SW programmer would start to get a sense of what it's like when they go from single-threaded to multi-threaded programming. The amount of pipelining, concurrency, and speculative execution in hardware designs is enormous. This video also doesn't mention formal verification. Those tools can do things like exhaustively test a 64-bit adder, while brute force or constrained random testing could not achieve the same in any of our lifetimes.
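To put numbers on the adder point: exhaustive simulation is easy at small widths but doubles in cost per input bit, which is exactly why formal tools prove the 64-bit case symbolically (e.g. with BDDs) instead of by simulation. A toy sketch, using my own bit-level adder model rather than any real formal tool:

```python
def ripple_carry_add(a, b, width):
    """Bit-level model of a ripple-carry adder (the 'design under test').
    Returns the sum modulo 2**width, like the hardware would."""
    carry, result = 0, 0
    for i in range(width):
        abit = (a >> i) & 1
        bbit = (b >> i) & 1
        s = abit ^ bbit ^ carry                     # full-adder sum bit
        carry = (abit & bbit) | (carry & (abit ^ bbit))  # carry out
        result |= s << i
    return result

WIDTH = 8
# Exhaustive check: feasible at 8 bits (65,536 input pairs)...
for a in range(2 ** WIDTH):
    for b in range(2 ** WIDTH):
        assert ripple_carry_add(a, b, WIDTH) == (a + b) % (2 ** WIDTH)

# ...but hopeless at 64 bits: 2**64 * 2**64 input pairs.
print(2 ** 128)  # number of cases brute force would have to simulate
```

At one billion simulations per second, 2^128 cases would still take on the order of 10^22 years; a formal equivalence check handles the same claim in seconds.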
@scottfranco1962
1 year ago
@@ttb1513 The last company I worked at doing silicon design, Seagate, was emblematic of the basic problems with ASICs. It was 1995, and they were new to ASIC design in house. And they weren't very good at it. It was random logic design before Verilog. An example of this was our timing verification: we had timing chains that were too long for the cycle time, and thus they simply allowed for the fact that the signal would be sampled in the next clock cycle. Now, if you were an ASIC designer back then, what I just said would have made you reach for the Tums, if not call 911 for cardiac arrest. It's an open invitation to metastability. And indeed, our AT&T fab guys were screaming at us to stop that. I got put in charge of hardware simulation for the design, and I have detailed this fiasco in these threads before, so I won't go over it again.

The bottom line was that ASIC process vendors were losing trust in their customers to perform verification. The answer was that they included test chains in the designs that would automatically verify the designs at the silicon level. It meant that the manufactured silicon would be verified; that is, defects on the chip would be detected regardless of what the design did. My boss, who with the benefit of time I can now certify was an idiot, was ecstatic over this new service. It was a gonna fixa all o' de problems, don't ya know? I pointed out to him, pointlessly I might add, that our design could be total cow shit and still pass these tests with flying colors. It was like talking to a wall.

In any case, the entire industry went that way. Designs are easier to verify now that the vast majority of them are in Verilog. I moved on to software only, but I can happily say that there are some stunning software verification suites out there, and I am currently working on one, so here we are.
@JZL003
2 years ago
The random checks remind me a lot of fuzzing, a common programming practice where you throw tons of random inputs at a program and see if it crashes. Typically in software you use something like American Fuzzy Lop (AFL), which actually looks at which code is hit during each run and tries to tailor the new test cases to explore more of it. So as an example: if you're fuzzing a JPG parser, even if you start from a random seed, it'll learn to create the JPG image header and from there explore weird pixel values.
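That coverage-guided loop can be sketched in a few lines of Python. This is a toy, nothing like AFL's real instrumentation; the parser, its magic header, and the coverage trace are all invented for illustration:

```python
import random

MAGIC = b"JP\xff\xd8"  # made-up 'file header' for the toy parser

def parse(data, trace):
    """Toy parser that crashes only when the full magic header matches.
    'trace' records which comparison branches were reached (coverage)."""
    for i, byte in enumerate(MAGIC):
        if i >= len(data) or data[i] != byte:
            return
        trace.add(i)                     # one 'branch' per matched byte
    raise ValueError("crash: malformed state reached")

def mutate(seed, rng):
    """Flip one random byte of the seed input."""
    data = bytearray(seed)
    data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(iterations=500_000, seed=0):
    """AFL-style loop: any mutant that reaches new coverage is kept as a
    seed, so the corpus 'learns' the header one byte at a time."""
    rng = random.Random(seed)
    corpus, seen = [b"hello"], set()
    for _ in range(iterations):
        candidate = mutate(rng.choice(corpus), rng)
        trace = set()
        try:
            parse(candidate, trace)
        except ValueError:
            return candidate             # found a crashing input
        if not trace <= seen:            # new branch hit: keep as seed
            seen |= trace
            corpus.append(candidate)
    return None
```

The key line is `if not trace <= seen`: mutants that reach a new branch become seeds, so the fuzzer climbs the header byte by byte, whereas blind random mutation would need on the order of 256^4 tries to hit all four magic bytes at once.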
@PaulSpades
2 years ago
Software fuzzing comes from hardware testing.
@Blox117
2 years ago
So that's why video games are so buggy these days
@tissuepaper9962
2 years ago
@@Blox117 It's actually the reason they're only as buggy as they are. A million chimpanzees at teletypes will do things in your game that no human would ever think to do.
@imacds
2 years ago
@@Blox117 Game developers aren't given enough time to add all the features, let alone fix most of the bugs (bug fixing can easily take up 60%+ of the dev time), because you can always do both in post-launch patches and charge extra for the former. Game projects also tend to have lower testability due to the intense time constraints, the fact that it doesn't really matter if they are buggy or fail (they are just entertainment software), and the fact that they can be patched later, so there isn't any incentive to spend extra time getting it right now versus spending a little longer to patch it later.
@PamSesheta
2 years ago
Fuzzing is a less constrained input approach. In verification, more constraints are applied to limit "crazy" combinations to the actions customers will actually take. For the keyboard example, it might be better to constrain the simultaneous key presses to a max of 10, rather than allowing nearly all the keys to be pressed at once all the time in all of the tests. This lets you focus on things that are likely to happen in the course of normal use rather than finding obscure issues in corner cases that won't happen. The config space for ASICs is wide enough that full fuzzing would be time-intensive and difficult to debug; it would probably reveal more problems with the test environment than with the design. I've been interested in a fuzzing approach to verification, but like I said, tracing back misbehavior is often harder, and the scenarios may not even be possible on the bench. It is more valuable to find bugs that impact the major use cases and can be replicated on the bench.
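The keyboard constraint maps directly onto how constrained random stimulus generators work; here is a minimal sketch (the key names and the 10-key cap are just the illustration from the comment, not a real UVM sequence):

```python
import random

KEYS = [f"K{i}" for i in range(104)]  # hypothetical 104-key keyboard

def constrained_random_presses(n_tests, max_simultaneous=10, seed=0):
    """Constrained random stimulus: each test is a random key combo,
    but never more than max_simultaneous keys are held at once,
    mirroring what a real user could plausibly do."""
    rng = random.Random(seed)
    tests = []
    for _ in range(n_tests):
        k = rng.randint(1, max_simultaneous)        # constraint applied here
        tests.append(frozenset(rng.sample(KEYS, k)))
    return tests
```

Removing the cap turns this back into unconstrained fuzzing: legal stimulus for the tool, but mostly scenarios no customer will ever produce.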
@wfjhDUI
2 years ago
It's horrifying and absolutely hilarious that even Intel engineers working on hardware are still mostly making regular ol' typos and "copy-paste" errors like us normies.
@abebuckingham8198
2 years ago
These are also the most common errors in advanced math textbooks. I have a keen eye for errata, so I notice when it's a 1 instead of an I. It's why I'm so fun at parties.
@TDRinfinity
2 years ago
If your impression is that Intel engineers are good programmers, I have some bad news for you. Hardware engineers in general are pretty poor programmers, most having written mostly C and Perl code, without much (or any) experience with modern software engineering practices.
@wfjhDUI
2 years ago
@@TDRinfinity They literally make the best optimizing compilers and performance tooling. Apparently they know a great deal about modern software engineering.
@Shrouded_reaper
2 years ago
Also consider there are likely malicious actors who are part of designing the chips for most big enterprises.
@slicer95
2 years ago
@@wfjhDUI The performance, compiler, and tooling teams are not the ones involved in verification.
@diffore
2 years ago
As a full-time verification engineer, I hate the shortened chip development cycle (and modern capitalism in general, tbh). It puts enormous stress on the verification team due to the limited time constraints (that, and the increased tapeout queues at the fab: if you miss your time slot, you have to wait another half a year). Besides, there are times when throwing more people at the project will only cause more problems, in the same way that throwing more cores at a processor will not yield the expected N-times productivity. I sometimes think about switching back to software jobs, simply because verification usually ends up being an overtime mess.
@hoodoooperator.5197
2 years ago
Throwing more people at projects was what my former employer thought to be the best solution... In all cases. Needless to say, it didn't end well for them.
@ThylineTheGay
2 years ago
It's so annoying how capitalism hinders growth; rather than these companies all fighting and trying to screw each other over, they could all work together to make the best product possible
@MrAngryCucaracha
2 years ago
@@ThylineTheGay I don't think you can find a more growth-oriented system than capitalism.
@TheDXPower
2 years ago
@@felixknight2278 If you actually look at historical sources (primary and secondary), their scientific complex was not efficient at all. Take their rocket program, for example. During the Space Race, the extremely bureaucratic nature of the Soviets' development agencies was so restrictive and inefficient that the top scientists in the various agencies made an "underground" board to handle matters themselves, calling it the "Council of Chief Designers". For example, when discussing the practicality of reproducing the A-4 (technical name for the German V-2 rocket), the men at NII-1 had to report to the chiefs in the Ministry of Aviation Industries, who proceeded to deny the military applications of rockets. The team at NII-1, in violation of their orders, still decided to document and analyze the A-4 rocket in secret. Had the team not defied instructions, the development of the R-1 would have taken significantly longer. Additionally, during this time, the three major agencies involved in rocket-related technologies (NII-1, OKB-SD, OKB-51) "worked independently of each other, despite the fact that ... they were employed by the same 'ministry,' the People's Commissariat of Aviation Industry." This led to rampant conflicts of interest and duplicated work. Source: Challenge to Apollo - The Soviet Union and the Space Race, 1945-1974 by Asif A. Siddiqi (2000)
@rashidisw
2 years ago
There's a notable lack of massive death counts during non-war situations under capitalism, but such lists certainly exist for communism.
@adamrashid4100
2 years ago
Thank you so much indeed. I can't wait for these titles. I am starting a new job at a semiconductor company in January. 🤞
@HaroldR
2 years ago
Thank you for these amazing videos. The hardware manufacturing world is a beast of its own, and I love learning about it through your content.
@SamuelLanghorn
2 years ago
What do you do with the knowledge you acquire?
@catsspat
2 years ago
The key word here is "art." I'm serious. Some decent (but still relatively small) percentage of people can "learn to code," but only a very small number of them can become functional artists.
@apptouchtechnologies3722
2 years ago
Amen
@TheNefastor
2 years ago
Unfortunately, just like artists, our talent tends to be recognized only after we're gone 😅
@qk.6535
2 years ago
Hello! I am a beginner in coding. I would like to get better, so what exactly do you mean by "artist"? Thanks!
@TheNefastor
2 years ago
@@qk.6535 Basically, it's when you care about writing elegant code instead of just "good enough" code. It's also when you come up with funky algorithms that can speed your code up a thousandfold, like the FFT. When you see your data the way Neo sees the Matrix at the end of the movie, you'll know you've become a code artist and not just another StackOverflow leech 😄
@catsspat
2 years ago
@@qk.6535 This isn't a topic that can be described easily, as there are many aspects to it. For example, there are people who handle 10x, or even 100x, the workload of a typical code monkey, yet aren't stressed out and don't spend extra hours at work. They spend much of their time thinking about the problem, a proportionally *very small amount* of time actually coding, and practically no time debugging (at least their own code). Also, when new requirements come up (new features, specification changes, etc.), they take very little time updating their existing code to support them, because it's almost as if the code was already written to support the new features. It seems like magic, like an ability to predict the future, but it really isn't. One "art" aspect of chip verification is the ability to verify the design against its actual real-life functionality, rather than against just some specification. For example, the specification might contain a big table of what the inputs and expected outputs are. Validating the hardware against the spec is not even a half step. Understanding the reasoning behind the entire architecture, and verifying for that, is many classes beyond standard verification.
@kevinbyrne4538
2 years ago
This recalls the HAL 9000 (from "2001"), which confidently stated that any error made by a computer was ultimately due to human error.
@user-ty2uz4gb7v
2 years ago
GIGO
@andersjjensen
2 years ago
Intel shipped the original Pentium with a serious calculation bug. As in, you could trigger it in a simple spreadsheet...
@FisherGrubb
2 years ago
Intel also had to keep a bug in their chips for generations because it had become a "feature" that was baked into Windows, etc., and fixing it would have forced MS to recompile a lot. Pretty sad...
@TheNefastor
2 years ago
Is that the IEEE 754 non-compliance? IIRC their hardware floating point support was notoriously unusable in anything critical.
@MrGoatflakes
2 years ago
@@TheNefastor Nah, it's the Pentium division flaw: en.m.wikipedia.org/wiki/Pentium_FDIV_bug
@TheNefastor
2 years ago
@@MrGoatflakes That's exactly what I said. Faulty floating point support.
@alexanderphilip1809
2 years ago
I'll say this again. Yours is a criminally undersubscribed channel.
@R_Haas
2 years ago
Another great video. Your grasp of the subject and your productivity in making these videos are a big inspiration to me. Keep up the good work.
@Asianometry
2 years ago
*Tips fedora*
@Poptartsicles
2 years ago
Good video! This is why we're hearing more and more each day about massive hardware vulnerabilities. It's getting to the point where data centers should probably steer clear of new tech for the first few years, until they can be sure they won't suffer massive breaches.
@FisherGrubb
2 years ago
There's a great quote that I believe is from a game designer at Nintendo: "It's better to release a game late than a bad one, because a late one will only be late until it's released, but a bad one will keep its bad reviews even if it was on time but crap". Which shows how important verification is for stripping out as many bugs as possible.
@SianaGearz
2 years ago
Botched the wording a little bit, not to mention using vocabulary that lead representatives of a polite, child-friendly company wouldn't. Shigeru Miyamoto, quoted regarding the N64 delay: "A delayed game is eventually good; but a rushed game is forever bad." Interestingly, reviews don't even play into it, as they knew they had young, loyal fans who would buy the product without consulting any reviews. Disappointing them would burn that relationship forever, which is much worse than bad reviews. Remember, people didn't have Internet access back then. Magazines were around, but their reach was in the tens of thousands of copies, while product reach was in the millions for the same region.
@DANTHETUBEMAN
2 years ago
Seems like a good place for an AI.
@unknownmaster4342
2 years ago
Tell that to Cyberpunk 2077 and the GTA remastered trilogy.
@FutureBoyWonder
2 years ago
See this quote everywhere
@vzx
2 years ago
That assumes delayed production does not incur further expense. In reality, workers still have to be compensated for the extra weeks/months they have to work on the delayed project.
@ronnycook3569
2 years ago
It's very, very much like software - most of the time is not spent on the original implementation, but on looking for problems with the implementation and fixing them before they affect deployment to a production environment. Except that it's much, much harder to patch, so all that work is front-loaded.
@demoniack81
2 years ago
Honestly, that's only true if you suck at _originally implementing_ or you're working on old legacy software that's already full of bugs. Bugs happen, obviously, and they can be hard to solve, but if it's taking you more developer time to test than to actually implement things, something's gone badly wrong. In a properly designed codebase, 90% of bugs should be caught during implementation by the developer himself, and by the time you push to the test environment, the application should be almost bug-free. The only bugs left should be integration bugs that cannot be discovered until all the other teams working on other systems have _also_ deployed to the test environment, and those should cause fatal errors right away and be easy to track down, because you _should_ have extensive validation on all inputs and service responses. If you don't, you're asking for it.
@ronnycook3569
2 years ago
@@demoniack81 The key word in your reply is "should." Many of the things that "should" happen, in practice, don't. Even in the case of faultless coding, many bugs are a result of a poorly written or incorrectly interpreted specification. Or of bugs in the tools being used; the tools are themselves typically "legacy software." The only cases I've seen where debugging and maintenance did not consume the bulk of development time were cases where the developer either couldn't be bothered or was cut short. Counterexamples welcome.
@johnl.7754
2 years ago
Well, one thing I can predict about the future from when he created this video: he got his 100,000+ subscribers.
@Asianometry
2 years ago
Ha, when I made this video three months ago - I thought I would NEVER get to 100K ever.
@abeecee
2 years ago
This is such quality content. When I get that first big embedded systems internship thanks to my Asianometry industry knowledge, I'll be sure to sub to the Patreon :)
@TheOnlyDamien
2 years ago
I love how, when you were talking about how they rely on other people's work as dependencies, I instantly thought "that sounds a lot like our old pals NodeJS/NPM", and then you immediately went into the web development comparison lol. Have you ever thought of doing interviews with people in these kinds of fields? It could be cool to hear from those dealing with it directly, interspersed throughout one of these. Obviously they can't get too detailed, but general personal input would be cool to hear from those suffering on the chip frontlines lol.
@Asianometry
2 years ago
I do have conversations with industry professionals on occasion, but mostly on background. They mostly ask not to be quoted.
@TheOnlyDamien
2 years ago
@@Asianometry Oh yeah, that's totally fair. I was thinking it would probably be a nightmare to get anyone to actually come on and be frank about stuff. I can only imagine how hard your research portion is as it goes now! Love the videos man, I get hyped every time I see a notification.
@smoofles
2 years ago
Interviewing people who have to deal with NodeJS and NPM (and web dev in general) will just get you hours of someone going between laughing and crying without a word uttered. 🙃 (And after the interview, they’ll immediately run off all enthusiastic about this new JS framework that they _need_ to rewrite their project with.)
@Olivia-W
2 years ago
_Cough_ Log4j _cough._
@TheOnlyDamien
2 years ago
@@Olivia-W How could you do this to me on this day of days, fucking Log4j. Ridiculous situation and I feel horrible for anyone dealing with it in a complex corporate environment.
@BooooClips
2 years ago
I literally have no clue what most of the stuff you talk about in your videos means, but it's so interesting
@mikhailryzhov9419
2 years ago
Besides FPGAs, you can prototype on ASICs designed to simulate RTL, like Palladium. You can get pretty good frequencies.
@ab76254
2 years ago
I don't have nearly enough understanding of chips to be able to meaningfully engage with this video, but very interesting nonetheless!
@kwgm8578
2 years ago
In the early theoretical days of object-oriented programming, i.e., the methodology of building large programs from small, encapsulated, proven software modules, a pioneer in the field coined the term "software ICs" to describe what today are called objects, since computer engineers of that period were familiar with the concept of building systems from components in the hardware domain. It was interesting to hear you bring the concept full circle at the close of this video.
@autohmae
2 years ago
That's amazing and makes so much sense. The difference now, obviously, is the scale of the software and hardware designs.
@monad_tcp
2 years ago
It amazes me that they still haven't introduced more formal verification, like we have in modern statically typed programming languages. There's an entire body of computing science tricks that could be applied to electrical engineering. There's an entire PhD in that for me.
@kwgm8578
2 years ago
@@monad_tcp Go, Luiz! You'll discover whether you're onto something rather quickly in an engineering PhD program. The real world is very unlike a YouTube comment thread -- no BS allowed!
@ttb1513
1 year ago
@@kwgm8578 No "BS" allowed? Only PhDs? 😂 Or the other type of BS?
@ttb1513
1 year ago
@@monad_tcp Formal verification is used in chip design. As a trivial example, a 64-bit or 128-bit adder can be formally verified quickly to be 100% correct for all input combinations (look up BDDs for a starter). Constrained random testing, or even exhaustive sweep testing at the unit or block level, would never come close in our lifetimes. There is inherently just a huge amount of concurrency in a hardware design, and race conditions galore. A lot of people will take away that copy-paste errors are a common error and be astounded. Those are rare; I've never seen one. Intel's ancient divide bug was due to such an error: only part of a table of values was copied over. What's much more common (and can be lumped in with typos) is using the wrong delayed version of a pipeline state bit. A design can work often, but if the timing of some event shifts by one clock cycle, the design can break; possibly because of using "state_bit_d2" instead of "state_bit_d1". That's really a logic error, though, not a typo. Take the video's traffic light example: a possible error is a race condition where the 2nd car arrives at just the right time as the 1st opposing car exits the intersection, and the light fails to change for the 2nd car, getting stuck. It's contrived, but it amounts to the decision to change the light as car 1 exits mistakenly looking at whether car 2 was already there one clock cycle before it showed up, so the logic decides no change is needed; meanwhile car 2 arrives, sees car 1 in the intersection, and assumes car 1 will trigger the light to change when it exits (which it mistakenly does not). This is contrived, but it is actually a better example of a bug: a bug where the design does not meet the spec. The video mentions a spec error, not a design bug; the spec failed to state that the light must be fair when congested.
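The "wrong delayed version of a state bit" failure mode can be demonstrated with a toy clocked simulation. The signal names and timing below are invented for illustration, but the mechanism is the same as using state_bit_d2 where state_bit_d1 was meant:

```python
def simulate(car1_exits_at, car2_arrives_at, delay, cycles=20):
    """Toy clocked sim: the controller switches the light when car1 exits,
    based on a REGISTERED (delayed) copy of the 'car2 waiting' signal.
    delay=1 models the intended state_bit_d1; delay=2 models the
    one-cycle-off bug (state_bit_d2)."""
    shift_reg = [0] * delay            # pipeline of the waiting bit
    light_switched = False
    for t in range(cycles):
        car2_waiting = 1 if t >= car2_arrives_at else 0
        delayed_waiting = shift_reg[0]           # value from `delay` cycles ago
        if t == car1_exits_at and delayed_waiting:
            light_switched = True                # controller reacts to car2
        shift_reg = shift_reg[1:] + [car2_waiting]  # clock edge: shift
    return light_switched
```

With the intended one-cycle delay the controller sees car 2 and switches the light; with the extra cycle of delay the same stimulus leaves the light stuck, and only at that exact arrival time (an earlier arrival still passes), which is exactly what makes such race bugs so hard to hit with directed tests.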
@josho6854
2 years ago
Excellent video! I think your comparison of chip design and web development is mostly valid. I used to work in the dram business, and respins were the rule rather than the exception. Analog circuits have infinite degrees of freedom, and are very difficult to simulate perfectly. First time right is much more achievable with a pure logic chip.
@Asianometry
2 years ago
DRAM
@meneldal
1 year ago
I do disagree about one thing: in web development, if a library doesn't work, it's mostly your problem if the maintainer doesn't feel like fixing it, while if you buy an IP from ARM, Cadence, or Synopsys and show it doesn't follow the spec, they will fix it and apologize. I would also mention that there are a lot fewer critical bugs, and vendors publish clear notes about what was changed and how it can affect you. It is very unlike companies like Google that change stuff and break your code.
@kristianTV1974
2 years ago
I used to work for Altera (now Intel FPGA) in the UK in the mid-2000s and visited ARM in Cambridge regularly, as they used the largest Stratix devices of the time (EP1S80?) to do design verification, and IIRC they used the largest equivalent Xilinx Virtex devices too. The thing with FPGAs, though, is that due to the reprogrammable nature of the fabric, they will never operate at the same speed as dedicated silicon (they're always slower), so the verification would be very much functional rather than timing-based.
@MrRosselhoff
2 years ago
First of all, great video; this channel is absolutely one of my favourites! This sounds a lot like the job of CSV (computer system validation) in pharmaceutical manufacturing. Although I admit it's technically far behind the complexity of chip design, the principles are the same: we want to avoid bugs like the Therac-25 incident, where a code error in a radiation therapy machine killed several patients undergoing cancer treatment. We work to the GAMP 5 ISPE standards, which instruct on the level of validation. To get around the problem of performing endless testing to cover every case, we use a branch of maths called DOE (design of experiments) to calculate the best permutation of experiments to conduct. I was wondering whether the constrained randomisation you described utilises some DOE, or would the maths become too complex to be practical?
@exponentmantissa5598
2 жыл бұрын
I worked in wireless design for years, and one thing I can tell you about short design cycles is that although you get stuff out fast, the chance of errors is of course increased. But here's the catch: in the wireless arena you had at most a 6-month to one-year lead on competitors. If you screw up and miss a design cycle, you have to respin and now your lead is gone. Do it again and now you're in your opponent's rear-view mirror. I have a couple of fantastic stories about what happens when you push too hard on the design cycle.
@kelvinnkat
2 жыл бұрын
The "I'd like to teach 100,000 subscribers one day" is some sort of running gag, right?
@jgaztelu
2 жыл бұрын
There is a big push in the industry to adopt formal verification, as it can greatly reduce the effort for some verification tasks. However, it's a totally different paradigm from the traditional UVM/constrained random flow and it can't fully replace it, at least for the moment.
@abuttandahalf
Жыл бұрын
How is formal verification different from what is currently done?
@adissentingopinion848
Жыл бұрын
@@abuttandahalf Basically there are mathematical proof-finding algorithms already in use in academia. The gist is that the formal verification engine takes in "assertions" that you write, which establish things like "condition A must occur after condition B within X clock cycles", allowing individual line-item requirements to be broken down into checks in the code. This is also what is done in UVM and constrained random flow, but formal has quite a few limitations and extensions that are far less hardware and more mathematics. Here are two statements we can use:
p until q - if q never happens, p holds forever
p s_until q - q must eventually happen in the proof
Leading to statements like
(@posedge clk) start ##1 !complete[*] ##1 complete |=> done ##0 eventually finished
which means "on each clock edge, if start is valid, AND complete is NOT valid for an indeterminate number of clock cycles, AND complete is then valid on the next clock cycle, then on the following clock cycle done MUST be valid AND finished must be valid *at some point*". It's got the same density as regex, only now it's time-based and you can overload the formal verification engine by writing testing conditions that explode exponentially. Yippee.
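To make the flavour of those temporal operators concrete, here is a hypothetical Python sketch (not real SVA tooling; the function name and trace format are made up) that brute-force checks a "p s_until q" style property over one finite recorded trace. Real formal engines prove the property over all possible traces; this just scans a single waveform:

```python
# Hypothetical sketch: checking a strong-until property ("p s_until q")
# over a finite signal trace, in the spirit of the assertions above.

def s_until(trace, p, q):
    """True if q eventually holds, and p holds at every cycle before that."""
    for cycle in trace:
        if q(cycle):
            return True          # q happened: obligation discharged
        if not p(cycle):
            return False         # p failed before q arrived
    return False                 # q never happened: strong until fails

# Each cycle is a dict of signal values, like a tiny waveform.
trace = [
    {"start": 1, "complete": 0},
    {"start": 0, "complete": 0},
    {"start": 0, "complete": 1},
]

# "complete must eventually rise, and stays low until then"
ok = s_until(trace,
             p=lambda c: c["complete"] == 0,
             q=lambda c: c["complete"] == 1)
print(ok)  # True: complete rises on the third cycle
```

Note how the "strong" part (q must actually happen) is the final `return False`; a weak `until` would return True there instead.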
@leomethyst5045
2 жыл бұрын
It would be nice to hear your opinion on future of RISC-V or Arm Total Solutions for IoT. Great video as always.
@Asianometry
2 жыл бұрын
The story of RISC-V is still developing. I feel I wouldn't have much to say here.
@celeron800
2 жыл бұрын
What's with all the screenshots of PCB designs in an ASIC video :D . Anyway, the DFT (design for testability) and DV (design verification) teams are 2 to 3 times the size of DE (design engineers) + PD (physical designers), and they try to work in parallel with the design process, with periodic feedback provided by the other teams. Also, as a PD you run multiple time-consuming verification steps - timing, noise, DRC, ERC, LVS, LEC, ANT, power, fill checks etc. - which factor into the ~60-70% verification time.
@Asianometry
2 жыл бұрын
Got to have something to show people
@rfengr00
2 жыл бұрын
Nice overview. There is also formal verification, which I believe is akin to a mathematical proof that the logic works as intended. The formal verification is needed as there are too many combinations to check. I was also surprised that first pass success was only 38%. I wonder if this is mostly for analog and RFIC?
@lubricustheslippery5028
2 жыл бұрын
I have a feeling that doing the formal verification can be harder than designing the actual chip. And then the risk of errors in the formal specification can be bigger than in the actual product. It must be so hard just to define exactly how a chip should function.
@developandplay
2 жыл бұрын
As a software person dabbling in chip design, I initially assumed that by verification the hw people meant formal verification. I was so surprised to find out that verification for hw is just a different word for testing. Honestly, having worked on a few open-source hw design projects, the test coverage is often worse than in sw projects. Even talking to a few industry folks, it seems like hw verification is a lot more manual and inconsistent than software test cases. Now obviously the big chip makers can afford to hire huge teams of verification engineers, but it looks pretty weird to me. In software projects, pure test engineers are pretty rare, as just writing tests all day long would feel demeaning to most software engineers.
@rfengr00
2 жыл бұрын
Let’s say you had something simple, like a counter with set, reset, and clock. Now make that counter 64-bits wide. It’s impossible to check the transition from every state to the next, for every input sequence; maybe take 100 years even in FPGA verification. With formal verification you can do it in a few seconds. Ha ha, don’t ask me how it works. I just know you mathematically prove it works, and it takes the same time as to verify a 4-bit counter.
@developandplay
2 жыл бұрын
@@rfengr00 Well, with formal verification you do mathematical proofs on the structural properties of the circuit. While I do think this has a lot of potential for both hardware and software engineering, the reality is that it requires quite specific skills and has not really caught on in decades. Aside from security-critical applications that may require formal proofs, like cryptography, the effort seems too big for broader adoption. However, I do hope this changes, as for hardware we should insist on correctness.
@someonespotatohmm9513
2 жыл бұрын
@@rfengr00 It becomes more complicated on more complex systems. It's not really my field, but if your code or hardware has to handle a lot of edge cases, it also becomes hard to describe them for formal proofs.
@GThu1
11 ай бұрын
Great presentation! As a very senior software developer, I found this very interesting. It looks like the HDL part is very close to regular software development, including the (very familiar) execution and emerging problems of the practical processes and testing/QA. I do want to make one remark: you mentioned, "In the software world, it is easy to handle such design problems by issuing an update". Well, it's really not, in most cases. Rolling out an update for already-released software with a major design flaw causes a lot of headaches and costs, especially when the software creates a lot of data. Also, with today's firmware-based designs, you can handle a lot of things with an update to the hardware's firmware. The two kinds of development are coming closer and closer over time.
@Joemama555
2 жыл бұрын
15:55 lol "i would like to reach 100,000 subscribers some day". I guess that was recorded a while ago... :)
@Asianometry
2 жыл бұрын
Yeah my bad.
@fbkintanar
2 жыл бұрын
Interesting. I know that hardware verification has some links to formal methods in software design. I wonder whether ideas from functional programming in software, like monads for managed (side-)effects, have any present or future impact on hardware verification.
@Nekroido
2 жыл бұрын
As a software engineer I'm excited to reinforce my understanding that hardware design is literally the same thing at an abstract level. Thank you for this great insight into the hurdles of chip development.
@GoogleUser-ee8ro
2 жыл бұрын
Can I say, this trait of chip design (verification difficulty scaling up exponentially with chip complexity) favors ASICs over general-purpose chips such as CPUs or GPGPUs, which have way more "corner cases" to cover and discover! It may suggest it will be increasingly uneconomic to design "general purpose" chips in the future...
@W1ldTangent
2 жыл бұрын
I think for a time we'll find a lot of short-term efficiencies using machine learning techniques mixed with traditional ones, but long-term it'll be rendered moot if we ever get a solid grasp on quantum computing. That's a ways off though; for now, best let the algos have a crack at it before we drive our small minds insane 😂
@awebuser5914
2 жыл бұрын
You need to stop using Eagle or other PCB design software images when talking about silicon EDA, it's easy enough to find examples of silicon design software! Just looks careless and kinda clueless...
@thosewhowish2b693
2 жыл бұрын
Man, 2:36 gave me pause. Is that Eclipse for RTL design? Can you explain what is going on there? Is that an existing toolchain or did someone hack together an environment with open source synthesis tools and etc.?
@nandakanda001wasabi
2 жыл бұрын
I remember having to explain to management how I mistakenly mirrored a design layout after several sleepless days designing it to get it done. Conclusion: sleep deprivation affects concentration.
@anonyshinki
2 жыл бұрын
Constrained random verification, I thought I'd never hear the term again after defending my undergrad paper over ten years ago :D We were doing very basic stuff with old MIPS designs though, I was writing a Ruby DSL that generated a boatload of assembly tests for testing Verilog designs. But my academic advisor was doing similar work with the Elbrus processors (classified, of course, so we never saw any of it), so the various hardware verification related stuff that students and research assistants under him were doing might have been repurposed for that.
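A constrained-random test generator of that flavour can be sketched in a few lines of Python (hypothetical instruction names and constraints, not the actual MIPS encodings or Ruby DSL described above):

```python
import random

# Hypothetical sketch of constrained random test generation:
# draw random instructions, but constrain operands so every test is
# legal (e.g. never write register $0, which is hardwired to zero).

OPCODES = ["add", "sub", "and", "or"]

def random_instruction(rng):
    op = rng.choice(OPCODES)
    rd = rng.randrange(1, 32)          # constraint: never target $0
    rs = rng.randrange(0, 32)
    rt = rng.randrange(0, 32)
    return f"{op} ${rd}, ${rs}, ${rt}"

def generate_test(seed, length=10):
    # seeded RNG, so a failing test can be reproduced exactly
    rng = random.Random(seed)
    return [random_instruction(rng) for _ in range(length)]

for line in generate_test(seed=42, length=5):
    print(line)
```

The key points carried over from real flows are the seeding (reproducible failures) and the constraints (random, but always architecturally legal).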
@Penultimeat
2 жыл бұрын
I work in the Test department at a FAB. We occasionally get these development lots. The software is precise and fallible.
@diggleboy
2 жыл бұрын
I really enjoy your videos, Jon. This video essay is a great take on the semiconductor design problem. I do believe that AI will gradually step in to help alleviate some of the challenges and issues with manufacturing silicon, but the industry has been slow to adopt this technology to make vast improvements to silicon manufacturing quality. Some in technology think "if it ain't broke, don't fix it", and that needs to change in my opinion.
@vyor8837
2 жыл бұрын
AI isn't magic
@PKAdazGalaxiaz
2 жыл бұрын
The one-size-fits-all approach is hard. Taking the time to build an AI might not be worth it when paying a team of 10 to do the testing comes cheaper. We have seen an increase in AI used for the architecture of chips, but even then you spend years developing the parameters and code for the AI itself. AI shouldn't be seen as a way to put out chips at a faster rate; it screws up too. I think we just need to understand that all parts of the chain need time to catch up. We have research projects showing we can shove more transistors into a smaller area, but so what, if the machinery hasn't caught up and only a few people are knowledgeable enough to put it into practice, it's all for nothing. Customers want to hear that the next product has a newer chip with faster speeds; they have grown accustomed to getting a new phone every year with some sort of improvement. This can only last so long; the chain is bound to collapse at some point. Wouldn't it be a waste to invest billions in a technology you know will only be relevant for a year or two? Any AI you build will lose its efficiency as soon as the chips get an upgrade, and now you're left with a useless AI that needs to be changed to suit the new chip. People with proper training are just more cost-efficient for a business.
@hoodoooperator.5197
2 жыл бұрын
Great video man. I've just finished my first term in MSc Digital Systems Engineering, it was so insanely hard, but I loved it. Particularly using cadence for layout. I seem to have a knack for layout 😎
@platin2148
2 жыл бұрын
Is it actually possible for Intel to emulate their chips on an FPGA? I don't think there are enough LUTs in a single FPGA.
@W1ldTangent
2 жыл бұрын
Obviously not, but they can simulate blocks of it in isolation up to a point, or wire multiple FPGAs together. At the end of the day, they still need to make sure all the pieces fit together, I honestly don't know how the Intels of the world get around that part without a leap of faith on a small run of engineering test samples.
@MisterFanwank
2 жыл бұрын
"OpEn StAnDaRdS" So is the C++ standard, and yet there's nothing actually open about it. You have to pay to get the official spec, you have to pay to contribute to the project, and the entire thing is massively bureaucratic. What is open about any of that?
@Asianometry
2 жыл бұрын
I don’t think open = free
@HolbrookStark
2 жыл бұрын
@@Asianometry then what's open about it?
@gebys4559
2 жыл бұрын
Great video. I personally hated doing verification on FPGA.
@12321dantheman
2 жыл бұрын
interesting. Just starting as a verification engineer and watching this to find out what it is I do
@HolbrookStark
2 жыл бұрын
You must be the latest new hire at Intel
@01sevensix
3 жыл бұрын
More outstanding work, Jon. Thank you.
@luket2915
2 жыл бұрын
how did you even comment
@skazka3789
2 жыл бұрын
Why is this comment from 2 months ago when the video just came out a day ago lmao
@manhoosnick
2 жыл бұрын
Wtf
@lebimas
2 жыл бұрын
Quite a few of your recent videos have mentioned "I'd like to reach 100,000 subscribers someday..." but you have been past that mark for some time now, just letting you know. I assume these videos were made before you hit the mark.
@Asianometry
2 жыл бұрын
Yeah. 3 or so months ago
@Tential1
2 жыл бұрын
@@Asianometry was funny to hear, congrats
@apptouchtechnologies3722
2 жыл бұрын
Bugs are often zero days. My understanding is some of the x86 microcode isn’t fully tested….
@SianaGearz
2 жыл бұрын
x86 is a prefixed instruction set, so it has potentially millions of valid instructions, most of which are redundant, and even more byte sequences that do SOMETHING despite not being reasonably expected to be an instruction; see "Sandsifter - breaking the x86 ISA". It is probably more or less tested for the few thousand instructions that compilers and assemblers will actually emit; but for one thing, nontrivial bugs do pop up from time to time, where an instruction performs correctly under some circumstances and incorrectly under others; and for another, data can become executable.
@ttb1513
Жыл бұрын
A better example for the traffic light at 5:00 is if the later car shows up just +/- 1 ns from when the 1st car clears the intersection. This could create a corner case where two "rare" events happening at just the right relative times cause the system to hang or screw up, like the light getting stuck and never changing for the 2nd car. The "fairness" example given was that of a spec being met, but the spec being incomplete. That is a problem. But just getting a design to meet a correct spec when there are tons of concurrent and interacting events going on is very difficult in itself, especially when some of those events are quite rare.
@ttb1513
Жыл бұрын
Also, relating to the keyboard testing example 13:40, this video did not mention doing unit tests of blocks of the design. At that level, it is much easier to control and sweep through interesting cases, often exhaustively. Something that usually cannot be done when the design block is exercised within the context of the entire chip. Additionally, formal verification tools are also used. They can verify properties exhaustively that cannot even be tested with functional tests. A simple example is that a 64-bit or 128-bit adder can be exhaustively tested with formal verification tools. Constrained random or brute force sweep testing cannot do the same in our lifetimes.
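To illustrate the contrast, exhaustively sweeping a small adder really is trivial, while the same brute force at 64 or 128 bits is hopeless, which is where formal tools come in. A minimal Python sketch of the exhaustive side (a toy ripple-carry model checked against integer addition, not a real tool flow):

```python
# Exhaustively verify a 4-bit ripple-carry adder model against
# integer addition. 16 x 16 = 256 cases, done in an instant.
# The same sweep at 64 bits would be 2^128 input pairs.

WIDTH = 4

def ripple_add(a, b, width=WIDTH):
    """Bit-level ripple-carry adder: returns (sum mod 2^width, carry-out)."""
    carry, total = 0, 0
    for i in range(width):
        abit = (a >> i) & 1
        bbit = (b >> i) & 1
        s = abit ^ bbit ^ carry                               # full-adder sum
        carry = (abit & bbit) | (abit & carry) | (bbit & carry)  # carry-out
        total |= s << i
    return total, carry

for a in range(2 ** WIDTH):
    for b in range(2 ** WIDTH):
        total, carry = ripple_add(a, b)
        assert total == (a + b) % (2 ** WIDTH)
        assert carry == (a + b) // (2 ** WIDTH)
print("all 256 cases pass")
```

A formal tool instead proves the same equivalence symbolically, so the cost does not grow with 2^width.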
@gregparrott
2 жыл бұрын
Just a few weeks ago, an 'Asianometry' video listed ~90k subscribers and the speaker said he would like to reach 100k subscribers. He made this request again here (15:54) on the day this video was released (12/5/21), and it lists 140k subscribers (including me). It looks like his base is quickly growing.
@Asianometry
2 жыл бұрын
I made the video 3 months ago.
@gregparrott
2 жыл бұрын
@@Asianometry Thanks for the reply. More importantly, your channel is fast growing ...well deserved.
@--027
2 жыл бұрын
Actual pros using modern software to create
@martinchabot_FR
2 жыл бұрын
FPGAs for verification are a thing of the past (mostly). We still use them, but for very limited SoCs or single-IP verification. Today we rely on accelerated emulation like Synopsys ZeBu, which can hold massive designs with ease, allowing multiple instances to run different tests.
@reinerfranke5436
2 жыл бұрын
Yes, splitting design and verification is a TREND. But it creates additional splits in knowledge across the group, and most verification failures happen because that knowledge is spread across heads. Yes, there are ways to make intentions, assumptions, issues, unsolved questions and so on more explicit by pushing them into tools and e-docs. Management sees the head split as an easy, fluid resource problem, overlooking that it can create more problems than it solves. Bob Pease said at DAC98, at a plenary session I attended: "Try to make zero-phone-call designs." Communication about what has to be done is the key. Miscommunication is THE failure; everything else is a matter of resources, planning and exercise. That applies from final silicon user to designer, as well as between designers and verification engineers. Again, management tries to shortcut this by introducing added work rules. I found it more useful to have designs verified by other designers, since most verification engineers have a software background. Yes, things like make and regression apply to hardware in a good way too. But that does not solve the work ethic of "OK, I missed it. Let's fix it in the next release." They do not understand the difference between a 1-hour compile and a 6-month mask cycle. They have not internalized that a release cycle in hardware could burn a company. And my personal experience is the tension between careless, lazy software design principles and hardware design ethics. In the future I think the node slowdown will create a healthier ecosystem, where market sharing gives the opportunity to specialize and design more robust, deeper-tested and cheaper hardware components, like the decades-old libraries in today's software world. SoC design today is a rush to create a whole system every 6 months.
@gazzy01
2 жыл бұрын
Taiwan is such a massive force and motivation to everyone around the world. An indigenous semiconductor industry is non-existent in India. I hope semiconductor foundries come to India, as the US takes nearly all of the advantage of Indian talent.
@jamesmorton7881
2 жыл бұрын
The metastable state: each wave of designers just doesn't get it. Synchronizing clock domains.
@sUmEgIaMbRuS
2 жыл бұрын
I've specifically aimed to be a verification engineer ever since I heard about it being a thing during my Bachelor's in EE. This dream came true recently, but I've been struggling to explain to friends and family what exactly it is that I'm doing. It's cool to see it addressed in a popular channel's video 😀
@Asianometry
2 жыл бұрын
It’s cool to be called a popular channel. Thanks
@sUmEgIaMbRuS
2 жыл бұрын
@@Asianometry 100K subs means top 10 in my country :)
@jonathanthomas7287
2 жыл бұрын
which country?
@sUmEgIaMbRuS
2 жыл бұрын
@@jonathanthomas7287 Hungary (Central Europe)
@jurevreca9229
2 жыл бұрын
Hey Asianometry, I really love your videos. In case you take requests for video ideas, I would love to see a video on the growing open-source EDA tools ecosystem (Chisel, cocotb, OpenRoad, OpenRAM...).
@insu_na
2 жыл бұрын
It probably starts getting really fun for verification engineers when quantum effects also need to be taken into account 😬
@rayoflight62
2 жыл бұрын
Verification can be conducted automatically only to a very limited extent. EDA tools work like a software compiler, but the output is not code to run on a computer; it is a physical portion of circuit to be built on silicon. Only in this way can you generate a chip with four billion transistors. Verification requires that you build dozens of successive prototypes and test the device for inconsistencies (like when you add one more core but the speed of the chip decreases for some functions). It can take years, but hands-on testing is the only way to produce a faultless SoC chip like the 888. I believe that, year on year, the improvements on existing chips will become more and more marginal. Great analysis you have made here.
@neuralbrew2976
Ай бұрын
The biggest problem in verification is that design specifications are poorly defined. AI can eventually be trained to do verification, but getting humans to define well exactly what they want is a huge problem.
@theInsaneRodent
2 жыл бұрын
The fun part is that FPGAs also need to do verification in their development. So they simulate future FPGAs by using several current FPGAs.
@КирилоХацько
2 жыл бұрын
There are many people from IT (or who want to get into IT), so they very quickly spot the relation to software engineering. But if you want to become a really excellent IT practitioner, you get 10 years or more of experience and dig into IT history. And then you dig into other things: the history of engineering itself, linguistics, politics (yes, you can get insights for coding from there). And you will find out that the software industry is very young. It lacks many existing answers. The problem here: no standardization. You need to be able to get a core "from Intel", "from AMD" and "from Xilinx" and put it on a standard interconnect. According to my understanding of the silicon industry, this will come with wide RISC-V adoption in 5-10 years.
@crytoorlabs6347
Жыл бұрын
Do you know what the problem with KZitem is? It pays handsomely to people like you :) who can express and explain things clearly and consider them from different angles. Some of those people quit making videos after a while, once they reach a certain wealth level. So we, the consumers of such nice videos, are left with only braindead people's videos. Please don't stop making videos no matter how rich you get, will you?
@robertpearson8546
2 жыл бұрын
Your "verification" does not include all the design mistakes made in the planning stage. For example, both Intel and AMD CPUs are based on the 1945 von Neumann architecture. Direct execution stack machines have been shown to be superior, but are ignored by almost all manufacturers. Also, verification is very different from validation, which attempts to determine if the chosen design is a valid implementation of the problem.
@JohnnoNonno
2 жыл бұрын
I know it's OT, but I never understood why, if wafers are round, chips are square. Wouldn't hexagonal chips waste less material? More chips, less waste, more room for printing errors; hell, even more room for dissipation.
@WalterBurton
2 жыл бұрын
Pleasantly surprised. Thanks, Mr. Algorithm. Burnt-out old software developers like me love this.
@x2ul725
2 жыл бұрын
Some stuff like traffic signals for railways require even more verification steps. They require fail safe circuits so unsafe situations cannot be latched into place by failed software or failed hardware. This is not easy and turns out to be very expensive. Standards...
@robertlaw4073
2 жыл бұрын
Interesting. The "black box" approach to systems engineering seems to be what you're referring to: assume that your building blocks do what they say they do. But the reality is that QA is poor. And modern documentation is just garbage, probably because the code and even the language itself are "short term", meaning it will get a fix "in the next version".
@johnweiner
2 жыл бұрын
A lot of acronyms and abbreviations without explanation in this video... IP, for example, has very different meanings in the chip world and the internet protocol world.
@mhamma6560
2 жыл бұрын
Were you born in the US or CA (i'm leaning CA)? In TW now!? Usually that line of work would leave one little time to create vids, curious to hear your story. Thanks
@MooseBoys42
2 жыл бұрын
Verification coverage gaps are largely mitigated these days by three factors. First, most devices are online and able to receive software and firmware updates that can work around the problem. Second, most consumer devices are relatively inexpensive and designed to be replaced after less than two years, so the effect of a faulty chip is limited in scope. Third, mature supply chains mean that spinning hardware revisions is cheap, so tight iteration with progressively broader test audiences can be used. Also, "respins" are *extremely* common, and very inexpensive. Most GPUs are shipped on their third or fourth spin. It doesn't require reverification or relayout. Instead, chips are built with disconnected transistors and basic logic scattered around. If there's a bug, a change in the metal mask can be done very easily to tweak the logic.
@Travlinmo
2 жыл бұрын
This feels like one you could re-do every year or two to update. Something is likely to give to speed up the QA portion of this (i.e., someone decides to get an AI involved and builds AI chips just to build the test routines).
@OrangeC7
2 жыл бұрын
These two videos were an awesome peek into how people actually manage to create the magic black box that's in all of our modern devices!
@xybersurfer
2 жыл бұрын
Ambiguous specification, and then designing with wrong assumptions, is what causes the need for verification; that's what I'm taking away from your traffic light example. Why is this information suddenly available during verification?
@TechOrionater
2 жыл бұрын
I have yet to see anyone correct the mistake at 4:49 in the comments, so I'll correct it. The mistake is that if there is [no] [Traffic on blue street] it [waits one minute], when instead it should go back to [Traffic on red street].
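For what it's worth, the corrected flow can be sketched as a tiny state machine (hypothetical Python with made-up state names, just to show the fixed transition):

```python
# Hypothetical sketch of the corrected traffic-light logic:
# when blue street has no traffic, control returns to red street
# instead of inserting an extra one-minute wait.

def next_state(state, traffic_on_blue):
    if state == "red_street_green":
        # switch only when a car is actually waiting on blue street
        return "blue_street_green" if traffic_on_blue else "red_street_green"
    if state == "blue_street_green":
        # once blue street is clear, go straight back; the bug in the
        # diagram was waiting another minute here instead
        return "blue_street_green" if traffic_on_blue else "red_street_green"
    raise ValueError(state)

state = "red_street_green"
state = next_state(state, traffic_on_blue=True)   # car arrives on blue
state = next_state(state, traffic_on_blue=False)  # blue street clears
print(state)  # red_street_green
```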
@msimon6808
Жыл бұрын
Verification is very important. It is why you build prototypes. Only idiots go from design to production. And the usual reason is - "it will save time and money". I have never seen a case where it does. Ever.
@robertmccully2792
2 жыл бұрын
It's the same in any manufacturing. The best inspectors are the builders. Unless the builders are dishonest, which is why they try to keep the two separate.
@pdbsstudios7137
2 жыл бұрын
so if you start producing chips today you could beat Elon Musk's balance
@RohitKulshreshtha
Жыл бұрын
Constrained Random Verifications sounds a lot like fuzz testing that’s popular in software.
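The analogy is close: a minimal fuzz loop is constrained-random stimulus plus a reference-model check, which is exactly the shape of a constrained-random testbench. A hypothetical Python sketch (the buggy clamp function is invented purely for illustration):

```python
import random

# Hypothetical sketch: fuzzing a function against a reference model,
# the software analogue of constrained-random verification.

def dut_clamp(x, lo, hi):
    # "device under test": a clamp with a deliberate off-by-one bug
    return max(lo, min(x, hi - 1))

def ref_clamp(x, lo, hi):
    # golden reference model
    return max(lo, min(x, hi))

rng = random.Random(0)        # seeded, so failures are reproducible
failures = []
for _ in range(1000):
    x = rng.randrange(-100, 100)
    lo, hi = sorted(rng.sample(range(-50, 50), 2))  # constraint: lo < hi
    if dut_clamp(x, lo, hi) != ref_clamp(x, lo, hi):
        failures.append((x, lo, hi))

print(f"{len(failures)} mismatches found")
```

The `sorted(...)` line is the "constrained" part: inputs are random but always satisfy the precondition lo < hi, so every mismatch is a real bug rather than an illegal stimulus.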
@Samsul2013
6 ай бұрын
I thought the video would be about open-source tools for chip design; the few videos I watched were about chip development in the past. Good for students who want to know about chip-dev history.
@nikolaradakovic5050
2 жыл бұрын
It's impossible to make a testbench for every possible case; that's where functional coverage and formal verification kick in.
@theowl2044
2 жыл бұрын
I just hopped over to the silicon device industry, which I know nothing about. I didn't study all this in EE. Thank you for your videos. I hope your channel takes off.
@TheFreshSpam
2 жыл бұрын
Your videos are amazing on all topics. I can watch all of them all day. Great introductions and layouts of complex things we don't hear enough about.
@watercat1248
2 жыл бұрын
With how greedy companies have become, I won't be surprised if board manufacturing follows the same path as video games. Many big studios release video games every year on very short deadlines; the result is that games release unfinished and broken, all because the studios are greedy. It's very likely the same will happen in many other areas in the future. As time goes on, everything becomes more mass-produced, and because of this they use cheaper components and probably do less testing, etc.
@spehropefhany
2 жыл бұрын
Thanks. Amusing that you are using software metaphors to explain physical designs. We've come full-circle.
@petergoodall6258
2 жыл бұрын
One man’s hardware is another man’s software
@shelvacu
2 жыл бұрын
3:08 Go watch this other video first No card, no link in description, no comment, can't even find it in your uploads...
@andytroo
2 жыл бұрын
Verification feels a little like the halting problem: verify that the only thing this does is X. Much harder than 'show that at least the following behaviours exist'.
@williamlouie569
Жыл бұрын
It's like writing a complex computer program. There are going to be bugs as the chips become more and more complex.
@sauravkumargupta2379
2 жыл бұрын
I am a formal verification engineer and I totally agree with your video. The last part of the video is what we do in formal verification, i.e. random generation of numbers.
@theexchipmunk
2 жыл бұрын
You know, after this video I don't even want to know how many thousands of hours of work go into designing a more complex chip.
@giacintoboccia9386
Жыл бұрын
"It used to be that teams had a lot of time to complete a design" is true, but I have to smile thinking of the Intel 400X family.
@Fast_studyIQ
2 жыл бұрын
You should have mentioned the country where verification originated and is still growing!!
@bearcb
6 ай бұрын
Working on verification for more than two decades here, nice video. I'd just make some clarifications:
1- Most functional verification is done through simulation. The video shows emulators and FPGAs, which are often used, yes, but they are complementary resources. Same for formal verification, which is not mentioned. FPGAs and emulation are fast enough to allow testing of SoC firmware, which may not be viable in simulation (too slow), so that's the main use, not hardware verification per se.
2- In SoC verification one doesn't cover all the functionality of the IPs used. They are assumed to be verified already, so the focus is on their interconnection, aka integration verification. Usually IPs are verified in isolation, covering their whole functionality, before they are used in an SoC.
@valentinohose1723
6 ай бұрын
Hi, I'm an EE student graduating this year. What do you think of verification engineer as a career path, compared to design engineer?
@bearcb
6 ай бұрын
@@valentinohose1723 There is more demand for verification, because at least 2 verification engineers are needed for each designer, ideally 3 (which is often not the case, unfortunately). However, design is usually preferred because it's seen as more creative and has less pressure: the deadlines are borne by the verification team. Verification closure is the last deliverable before moving on to layout/fabrication for IPs/SoCs.
@valentinohose1723
6 ай бұрын
@@bearcb Thank you for your answer. As someone with 20 years of experience in the industry, would you recommend pursuing a career as a verification engineer? I'm interested in understanding the potential career advancement opportunities in this field. I found verifying a 32-bit RISC-V core with UVM in a school project quite enjoyable.
@bearcb
6 ай бұрын
@@valentinohose1723 it's a personal decision, you have to balance pros and cons. Demand will always be there. If you enjoyed your verification experience, I'd say go for it. But again, it's very personal.
@valentinohose1723
6 ай бұрын
Thank you
@autohmae
2 жыл бұрын
I hadn't heard of Unsplash, turns out it also is owned by Getty. :sadface:
Comments: 487