I've been in Computer Science graduate courses and lectures in one of the top programs in the US where it seemed like every week the first 5 to 10 minutes was spent trying to get the projector to work with the lecturer's laptop.
What amazes me on conference calls is how no one can explain audio issues. Really, really smart people on a call who can explain registers, cryptography, and kernel tuning.
Sounds like you are having network issues/bit errors? You are breaking up.
Sounds like your bluetooth is having issues? You are breaking up.
Sounds like they are mobile, and in a weak service area? You are breaking up.
As an audio engineer, DSP programmer, and someone who supports such systems professionally:
That is because the breaking up mostly sounds the same: a codec specialized for low-bitrate transmission being pushed into even lower bitrates, packets being dropped entirely or arriving late, etc.
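The point that different causes collapse into the same audible symptom can be sketched in a few lines (a toy simulation, not any real codec or conferencing stack; the frame size and loss rate are made-up numbers):

```python
import random

# Toy "audio": one second of dummy samples at 8 kHz, chopped into
# 20 ms frames the way a VoIP stack would before sending packets.
FRAME = 160  # samples per frame (assumption for this sketch)
signal = [i % 50 for i in range(8000)]
frames = [signal[i:i + FRAME] for i in range(0, len(signal), FRAME)]

def degrade(frames, loss_rate, seed):
    """Replace lost frames with silence. Whether the loss came from
    Wi-Fi, Bluetooth, or a congested router, the receiver only sees
    'frame missing' and fills the gap the same way."""
    rng = random.Random(seed)
    return [f if rng.random() > loss_rate else [0] * len(f) for f in frames]

wifi_loss = degrade(frames, 0.1, seed=1)       # pretend cause: bad Wi-Fi
bluetooth_loss = degrade(frames, 0.1, seed=2)  # pretend cause: flaky Bluetooth

# Both degraded streams show the same kind of artifact: runs of zeroed
# samples where frames vanished. Nothing in the stream says why.
gaps = sum(1 for f in wifi_loss if all(s == 0 for s in f))
```

However the frames were lost, the listener hears the same zeroed-out dropouts; the actual cause lives somewhere upstream, invisible at the receiving end.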
Who knows where such a thing comes from? It could be any number of things on a huge technological stack distributed over a large geographical area.
A domain expert should of course be able to distinguish various bad conditions by ear, e.g. clipping, saturation, wrong microphone distance/orientation, signal interference, hum, bad grounding, etc.
But not all people are good at analytical hearing regardless of them being engineers or not. That is why people pay the likes of me.
To say the network guru should automatically know the cause of a packet drop is like saying the postal worker automatically knows why the letter never arrived, or that a doctor automatically knows the cause of that cough.
This is mostly a consequence of the complexity of all the abstraction layers between "conference call" and "physical routed and switched connection". At best you might be able to identify that the visible/audible symptom is due to packet loss, but proving a root cause of the packet loss is extremely difficult. Through some tooling, like ThousandEyes (which I work on), you might be able to identify the hop in the path that's causing that forwarding loss, but unless you have access to that device it'd be impossible to prove exactly /why/ it has forwarding loss.
Any problem like that ultimately becomes a "5 Whys" kind of troubleshooting to get to a real root cause, and from an end-user device you generally don't have the necessary access or data to answer more than 2-3 layers of abstraction.
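From the endpoint's side, about all you can recover is which packets never arrived and which arrived too late to play out. A minimal sketch of that first layer or two (toy data; the 100 ms deadline is an assumed playout threshold, not from any real conferencing stack):

```python
def classify(packets, deadline_ms=100):
    """packets: list of (seq, arrival_delay_ms) for packets that arrived.
    Returns (lost_seqs, late_seqs) over the observed sequence range.
    This is all an endpoint can see; it says nothing about which hop
    dropped the missing packets, or why."""
    seen = {seq: delay for seq, delay in packets}
    expected = range(min(seen), max(seen) + 1)
    lost = [s for s in expected if s not in seen]
    late = [s for s in expected if seen.get(s, 0) > deadline_ms]
    return lost, late

# Toy capture: packet 3 never arrived, packet 5 arrived 250 ms late.
capture = [(1, 20), (2, 22), (4, 30), (5, 250), (6, 25)]
lost, late = classify(capture)
print(lost, late)  # → [3] [5]
```

Everything past that, e.g. which device dropped packet 3 and why, needs visibility into hops you usually can't touch, which is exactly where the "5 Whys" chain stalls.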
I try to be more specific, but there's just so much going into modern sound processing. I once told a guy his Bluetooth connection was failing, but the real issue was that he had a plane taking off overhead and the VC software was noise-cancelling everything.
Are those things distinguishable? I thought people say you're breaking up because they only see the symptom of audio cutting out and have no way to distinguish what the cause is.
Media tech professional here: hard to impossible to distinguish, especially because knowing precisely which signal processing your video-conferencing solution might be doing, under which conditions and in which version, is hard even for people who analyze one version of the software in a lab environment.
That is like a relative saying you are not an IT expert because you don't immediately know the distinct root cause of "the screen being black".
I just have a Chromecast in each room that I lecture in. Easy to bounce a tab over to the big screen, and it turns on the TV/projector. It also helps prevent notification or other screen-sharing-related mishaps.
My biggest complaints about this approach are that I can't easily see speaker notes (I have a fiddly workaround that I can use if I need it, but it would be nice if Google Slides supported the Chromecast use case a little better...) and that the TVs/projectors tend to screw around for 10-15 seconds before automatically choosing the right source.
I know I can open other tabs to look at the presentation, so I assume it would. But it's not good optics for me to have a phone out when I ask my students not to do so.
I can get a speaker-notes tab and a normal presentation tab open and cast one of them, but they all jump around and do the wrong thing on my screen (wanting to maximize, etc.) and require a lot of coercion; that's too much time to start a normal lecture.
Arguably it still hasn't matured. Or at least it's nowhere near being a solved problem, and we'll probably sidestep it before ever coming up with a real solution.
I'd put it in the same bucket as TODO lists. After centuries we still haven't settled on a solution.
Depends on the printer. I still remember the LaserJet 4000 series with great fondness. They were absolute tanks that almost never broke. You did have to replace wear items like rollers, but that was it.
I used to joke it was ironic how people in management would always have their presentations up and running almost immediately while programmers seemed to always struggle with projectors.
These days however nothing seems to work for anyone.
Technically yes, but since the WM usually comes as part of a desktop environment that also includes utilities and daemons which help with that sort of thing, to many folks "my WM makes a big difference in how easy it is to set up external monitors" is often true.
It drives me crazy that there isn't a long-running named process in, e.g., GNOME the DE that I could run under any WM to make the fn-volume keys on my laptop keyboard change the volume, the same as when I'm running GNOME. Why isn't that piece separated out? Or at least, where is the code so I can separate it out?
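For what it's worth, that piece does exist in GNOME as the media-keys plugin of gnome-settings-daemon (gsd-media-keys), but it's wired into the rest of the DE rather than shipped standalone. Outside a DE, the usual workaround is binding the XF86Audio* keysyms yourself; a sketch using sxhkd and pactl (assuming sxhkd is running and pactl is available via PulseAudio or PipeWire):

```
# ~/.config/sxhkd/sxhkdrc — bind the media keysyms to volume commands.
# Any hotkey daemon or WM keybinding config works the same way.
XF86AudioRaiseVolume
    pactl set-sink-volume @DEFAULT_SINK@ +5%
XF86AudioLowerVolume
    pactl set-sink-volume @DEFAULT_SINK@ -5%
XF86AudioMute
    pactl set-sink-mute @DEFAULT_SINK@ toggle
```

It's a few lines, which is arguably the answer to "why isn't it separated out": the glue is trivial, the polish (OSD popups, per-device handling) is what lives in the DE.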
Questions and experiences like that cause many to conflate behavior and experience with WM/DE.
I once helped Andrew Tanenbaum get his laptop to work with the projector in the room. His presentation was only slightly delayed because of it, but the irony ...
The burden shouldn't be on users. Why are computer manufacturers putting out a hardware/OS combination that can't work reliably with an external display? Don't they even test this common use case?
Even my Mac (which, to Apple's credit, does work 100% reliably with projectors) still struggles when I plug in multiple external monitors, blanking the screen, turning on one external display, blanking them both, turning on the other one, and so on. This is a manufacturer who normally accepts only a smooth, polished user experience, and they still can't get it perfect. What chance do Lenovo or similar garbage-tier plastic box manufacturers have of getting it right?