What I don't understand is why it took you 8 weeks to distinguish a timer from a transistor. That doesn't make your professor's reaction alright; I just find it puzzling.
It's a good question! I didn't think to check the markings on the chip. The lab tech was convinced I was doing something wrong with my setup, and in turn he had me convinced of the same.
Coincidentally, I've been knee-deep in some problems that I've applied the Cynefin framework to. I'd call this problem "chaotic", where throwing things at the wall might be _more_ effective than working down a suggested or tried-and-true path from an expert. I was pleasantly surprised just a few weeks ago when one of the more junior engineers on my team suggested updating a library - something I hadn't considered at all - to fix an issue we were having. (That library has no changelog; it's proprietary / closed source with no public bug tracker.) Sure enough, they were right, and the problem went away immediately - but I had been convinced this was a problem with the data (it was a sporadic type error), not a library problem.
No offense, but... when I was reading your story, I was somehow assuming that the marking on the part was unreadable or something...
After getting befuddling answers, would it not have been natural to check the base assumptions, starting with "do I have the correct part?" That is true as much in the "real engineering" world as in school.
You say "It could _never_ be the equipment's fault" as if it was, but it wasn't. The test equipment gave you correct answers, your device under test was wrong.
I'd say it's not natural to check a part that has been handed to you by an authority who claims it's the correct one; that's a learned problem-solving technique.
Or even more likely in a lab setting: have another student test your part in their setup for A/B validation testing.
Sort of like the first debugging tip here:
> 1. Understand the system: Read the manual, read everything in depth, know the fundamentals, know the road map, understand your tools, and look up the details.
Maybe? Although it seems more like it's actually #5:
> 5. Change one thing at a time: Isolate the key factor, grab the brass bar with both hands (understand what's wrong before fixing), change one test at a time, compare it with a good one, and determine what you changed since the last time it worked.
where in my imagined scenario, a student who had just finished the lab successfully could pull out their DIP-8 device and swap in the author's to validate that it was possible to make it work in a known-good environment.
That would be like exposing a first year CS student to a situation where "it could be a compiler bug" is one of the potential explanations.
It's closer to exposing a first year CS student who has never touched a computer before to Windows, when the work is supposed to be done on Linux, and the TA is hemming and hawing, and insists that the reason the sudo command isn't working is because the student is not following the steps correctly.
It's a problem that's obvious to diagnose... If you already have passing familiarity with the material. Most people do not have passing familiarity with electronic components when they step into an engineering program.
The part was marked as a timer IC.
Not just a timer IC. Literally the most common IC in the world for at least every year from 01980 to 02000 or so, maybe still today. I can understand the first-year student not recognizing it, but what the fuck was the lab tech's mental disability?
I would assume that you don't have access to the lab (and diagnostic equipment) at all times, and that you're taking other classes besides.
Also, him being a student, having the wrong component was probably not in his mental troubleshooting tree. I would guess that it was not in the lab assistant's troubleshooting tree either.
Also once you start down the road of troubleshooting, a false trail can lead you far into the woods.
Same package. The 555 is typically a DIP-8, and transistors come in the same package. So you would have to examine the cryptic markings and compare them with the other students’, and that’s only if you suspected some fuckup on the part of the knowledgeable people.
ALWAYS suspect some fuckup on the part of the knowledgeable people... especially them!
Trust, but verify.
Yes, my strict adherence to “trust but verify” was born from literal tears. It’s not worth trusting others when verifying takes only a small fraction of the project’s time. It has saved me incredible amounts of time in my professional life, and I’ve seen months wasted, and projects delayed, by others who hadn’t cried enough yet.
I would love to hear some of your examples, if only to reinforce your lesson to myself.
“Is the box plugged in? Did you cycle the power?”
I’ll trust that you understand each of those words individually but later verify that the box is actually plugged in.
That's why tech support has moved on to "unplug the thing, wait a minute, then plug it back in".
It gives the capacitors time to discharge; but more importantly, it gives an excuse to actually force the person to plug the thing in.
I ask people to unplug their Ethernet cable and tell me the colors they see on the wires all the time.
I don't care, of course. But they'll happily do that, whereas if I ask them to verify that the cable is properly plugged in, 99% of them will just say yes without so much as glancing in the cable's direction.
Earlier in my career, a client's system was not powered at all. I asked:
"Is it plugged in and switched on?" A: Yes, to a powerboard.
"Is the powerboard plugged in and switched on?" A: Yes.
I did the onsite visit and found the powerboard plugged into itself.
Normally I would facepalm and curse the idiocy, but... it was a care respite facility, and they had more pressing issues to deal with than I would ever want to - their role is heroic, I feel.
And an easy win already makes my day so I sorted it, told them it was fixed with a smile, and continued on.
"Trust, but verify" is just a polite (ie corporate) way of saying "Don't trust until you verify", right?
No, it says that projects should move forward without verifying that prerequisites have been fulfilled, but that the verification should take place anyway. It's about the pace at which you can go.
Trust-free:
Ensure that step A can go off without a hitch.
Begin step A.
Ensure that step B can go off without a hitch.
Begin step B.
Ensure that step C can go off without a hitch.
Begin step C.
Trust, but verify:
Begin step A.
Begin step B. Check that you have whatever you need for step A.
Begin step C. Check that you have whatever you need for step B.
Check that you have whatever you need for step C.
You can't finish step B until you have all the prerequisites, but you can start it.
I can only make sense of that saying in terms of how much trust to give - whether it's a high-trust or low-trust environment, whether you assume good will and basic competence or not.
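To make that pipelining concrete, here is a minimal sketch of the two schedules described above - purely illustrative Python, where do_step and verify are hypothetical stand-ins for the actual work and the prerequisite checks:

```python
# steps is a list of (do_step, verify) pairs, e.g.
# [(begin_a, have_what_a_needs), (begin_b, have_what_b_needs), (begin_c, have_what_c_needs)]

def run_trust_free(steps):
    # Ensure each step can go off without a hitch *before* beginning it.
    for do_step, verify in steps:
        assert verify(), "prerequisite not fulfilled"
        do_step()

def run_trust_but_verify(steps):
    # Begin each step on trust; check the *previous* step's prerequisites
    # while the current one is already underway. Work never waits on a
    # check, but nothing counts as finished until its own check has passed.
    previous_check = None
    for do_step, verify in steps:
        do_step()
        if previous_check is not None:
            assert previous_check(), "an earlier prerequisite was never fulfilled"
        previous_check = verify
    if previous_check is not None:
        assert previous_check(), "the final prerequisite was never fulfilled"
```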
e.g. you might assume that a sorting library from an internal developer at your company will put things in order but you might want to verify that it has reasonable worst-case performance for your use case. A no-trust situation might lead you to scrutinise everything about it - does it work at all, does it have horrendous performance in every case, is it a supply-chain attack with disguised errors leading to deliberate exploit holes.
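For what it's worth, that kind of verification can be a five-minute throwaway script rather than a full audit - something like the sketch below, where internal_sort is a hypothetical stand-in for the internal library, run against the inputs you'd suspect of triggering worst-case behaviour:

```python
import random
import time

def internal_sort(xs):
    # Hypothetical stand-in for the internal library call you're trusting.
    return sorted(xs)

def check_worst_cases(n=100_000):
    cases = {
        "random": [random.random() for _ in range(n)],
        "already sorted": list(range(n)),
        "reverse sorted": list(range(n, 0, -1)),
        "many duplicates": [i % 10 for i in range(n)],
    }
    for name, data in cases.items():
        start = time.perf_counter()
        result = internal_sort(data)
        elapsed = time.perf_counter() - start
        assert result == sorted(data), f"{name}: output is not actually sorted"
        print(f"{name:>16}: {elapsed:.3f}s")

if __name__ == "__main__":
    check_worst_cases()
```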
In this case, "trust but verify" might mean assuming the Professor and TA are doing an experiment they have done before, which basically works, but might have made a mistake or missed something while setting it up, writing the slides, or explaining it to you. "Don't trust" might mean the TA got the experiment from ChatGPT, hates OP for being on a scholarship and is trying to sabotage their success, and the whole thing isn't an Electronics course it's really the Professor's practical joke/psychology experiment about stressing students.
But that's the thing that both students and often the teachers forget. We don't run labs to go smoothly, we run labs because you'll have to troubleshoot. There is no learning experience in a lab that works without issues; in fact, IMO, if lab instructions are of the step-by-step type, they should always have some deliberate errors in them to get students to troubleshoot.
To play devil's advocate, just imagine the previous poster's story at a company, i.e. a junior engineer not being able to make some simple task work, telling their supervisor "it doesn't work", and it turning out after 8 weeks that they had grabbed the wrong part. Should they have expected their supervisor to check all the parts? Should they expect a good performance evaluation?
If after eight weeks a junior engineer is still toiling on their story, I'd ask why someone more senior didn't get involved.
There are lots of reasons - maybe the senior engineers are overburdened with other work (or don't care), maybe the project manager or team lead wasn't asking if the junior needed help, or maybe the junior was lying about their progress.
Either way, a story that goes for eight weeks feels excessive. Much as, to your point, taking eight weeks to figure out that there was a bad part feels excessive. My counterpoint is that teams don't typically operate like labs. In a college lab, the objective is for you, specifically, to succeed. In an engineering team, the objective is for the entire team to succeed. That means the more senior engineers are expected to help the more junior engineers. They might directly coach, or they might write better documentation. I don't believe that dynamic is present in a lab setting.
> Should they expect a good performance evaluation?
They should expect that particular incident to not affect their performance evaluation, since it was very much not their fault.
In your hypothetical scenario, your hypothetical junior engineer went to the senior engineer repeatedly for advice, and the senior engineer did not do their job properly:
> The lab tech was unhelpful, insisting that it must be something with how I had it wired, encouraging me to re-draw my schematic, check my wires, and so on. It could _never_ be the equipment's fault.
This is a huge failure in mentorship that wouldn't be ignored at a company that actually cares about these things.
> They should expect that particular incident to not affect their performance evaluation, since it was very much not their fault.
What do you mean not their fault? I've seen wrong parts delivered by suppliers, so yes, checking that the parts are correct is definitely the responsibility of an engineer who puts together a circuit.
> In your hypothetical scenario, your hypothetical junior engineer went to the senior engineer repeatedly for advice, and the senior engineer did not do their job properly:
>> The lab tech was unhelpful, insisting that it must be something with how I had it wired, encouraging me to re-draw my schematic, check my wires, and so on. It could _never_ be the equipment's fault.
Again, _never_ the equipment's fault? It wasn't the equipment, it was a part. So maybe it was an issue of miscommunication? I find it hard to believe that the lab tech said it could never be the parts, considering how those things are handled in student labs - small parts break all the time.
Maybe it's true and it was a crappy lab tech; maybe they could not imagine the part being broken. But I've seen the other side of the equation as well: when things don't work, students often just throw their hands up and say "it doesn't work" without doing any troubleshooting of their own, expecting the tutor/lab tech/professor to do it for them (quite literally, "can you check that we wired everything correctly...").
In my experience this does not get accepted in industry. I do acknowledge what the other poster said, though: in industry the incentives are generally different, and someone would have intervened if a project had been held up for 8 weeks by a single person.
Regarding the story, I wonder what would have been an acceptable solution (apart from the lab tech possibly being more helpful?). As a teacher, I would have accepted a report giving a detailed account of the troubleshooting steps etc. (but it needs to show that a real effort was made to find the cause; simply saying the lab tech couldn't help is not sufficient). Simply saying "it wasn't my fault because I had a wrong part" shouldn't just give you an A.
> What do you mean not their fault? I've seen wrong parts delivered by suppliers, so yes, checking that the parts are correct is definitely the responsibility of an engineer who puts together a circuit.
A student is far from an engineer.
> Again _never_ the equipment's fault?
The exact words the failed mentor used are not what matters here.
> In my experience this does not get accepted in industry.
This being the entire situation or the actions of the improperly used junior employee? Blaming the non-expert that was refused help is scapegoating.
> Simply saying "it wasn't my fault because I had a wrong part" shouldn't just give you an A.
It should give you more time.
> What do you mean not their fault? I've seen wrong parts delivered by suppliers, so yes, checking that the parts are correct is definitely the responsibility of an engineer who puts together a circuit.
Let's not move the goal posts, please. If you're going to use a hypothetical situation as an analogy, make sure it's actually analogous. Yes, an engineer who puts together a circuit has that responsibility, because they're an engineer. They went through the required training that makes them an engineer and not just an engineering student.
> I find it hard to believe that the lab tech said it could never be the parts, considering how those things are handled in student labs - small parts break all the time.
And therein lies the problem. You "find it hard to believe" that the lab tech could have been that unhelpful, just like the lab tech found it hard to believe that the student wasn't doing something wrong. Both you and the lab tech are behaving in a way that is inappropriate for a senior mentoring a junior.
In my experience mentoring others, the first assumption should not be that the person you're mentoring simply didn't do enough and that they should try to do better. Yes, that might end up being the case, but most of the time there's also something else that could have been done better. Maybe the documentation is not clear enough, maybe the process didn't help catch the mistake, maybe the expectations I set weren't clear enough, maybe I didn't communicate well enough.
"Go check your work again" is rarely helpful, even in the extremely rare cases where that's the only thing that needed to be done and no other improvements exist. If you're really convinced that they merely need to check their work again, guide them to it.
That's why they are junior and you are senior, because they need more guidance than you do. They will not develop the necessary insights and instincts without that guidance.
> I've seen the other side of the equation as well: when things don't work, students often just throw their hands up and say "it doesn't work" without doing any troubleshooting of their own, expecting the tutor/lab tech/professor to do it for them (quite literally, "can you check that we wired everything correctly...")
And in turn, you're arguing that the mentor should merely throw their hands up and say "go check your work yourself". Again, even that can be said differently: "Can you explain what you have checked so far and how you've checked it?"
> Simply saying "it wasn't my fault because I had a wrong part" shouldn't just give you an A.
You are drawing a lot of your own conclusions from what hasn't been said. In this comment thread, you have repeatedly and consistently shown bias through your assumptions. Yes, what you're saying could have been the case, but I see no evidence of it and no reason to simply assume it without at least inquiring about it.
For a college class grade, everyone is supposed to be tested on the same exercise. If all students were tested under the same scenario then it would be fair. For just one student to be tested under this scenario, but for all other students to get a free pass on the lab component identification diagnostic test, is not reasonable.
More to the point, the professor would be required to provide the same effort to every other student in the class.
While it's ridiculous to expect a student to have the skills of a professional, even a student needs to develop the assertiveness to demand a replacement part. This is a basic skill for debugging hardware problems: see if the problem manifests on more than one unit. Here it would be demanding another chip to try, early on. Chips can be marked correctly but damaged or defective.
> "But that's the thing that both students and often the teachers forget. We don't run labs to go smoothly, we run labs because you'll have to troubleshoot."
Hands up everyone who remembers being taught that labs were supposed to go wrong and you were doing them because you will have to troubleshoot?
... anyone?
... anyone?
... Bueller?
Or is this just the typical internet John Galt, like that other guy: "no offense, but why didn't you just already know everything and create an apple pie by first creating the universe, like I would have?"
Ohm lordy, we're blaming the student for not having years of homebrew experience before he entered school? Sure, any hobbyist knows what a 555 is, but when the lab assistant doesn't even catch it and the chip was handed out to the student, this is not an entry-level student's fault.