ProctorU is dystopian spyware (shkspr.mobi)
602 points by smitop on Nov 9, 2021 | 350 comments



The whole notion of online proctoring seems pretty whack to me also: what world are we training and testing people to live in? The real world has internet, you can search for stuff, you can work from home and take a break.

I'm not good at things because I'm always able to magically materialize the right answer out of my mind inside an anechoic faraday chamber, I'm good because I get the right answer in a reasonable amount of time using all the tools available to me (the internet, my brain, books, whatever).

I get that there are a few professions where you're really going to need to know the right answer quickly without looking it up under high-pressure situations, and it seems like testing and certifying doctors and lawyers is important enough to society that we can arrange to proctor those tests in person (and still maintain covid safety, disability/poverty access, etc). That level of extreme proctoring security seems pretty dumb for random exams WHERE school <= undergrad.

If you elaborately cheat your way through all online classes all the way through undergrad, that seems mostly bad for you


Having been through a lot of school and gotten a lot out of it, I am strongly convinced that the current society-wide policy of treating university education as expensive vocational training is a horrible and costly mistake whose main function for the overwhelming majority of students right now is to saddle people with debt. Universities primarily exist to train and employ researchers, and serve a secondary purpose of providing a broad-based education to members of the public. Anything that is useful vocational training in this process is basically an accident, and useful vocational training can be done in a much more time- and cost-efficient way.

I also think the way we test people is set up to be measurable rather than effective, and all this hand-wringing about cheating is an attempt to prop up a premise about pedagogical techniques that simply do not accomplish their goal (and a way to scam universities out of a lot of money for tools like this one). Most of the market for standardized tests is a racket built on market dominance and on irrational beliefs about their value, rooted in a (pervasive, policy-level) misunderstanding and subsequent cargo-cult-style blind worship of metrics per se.

On top of all this, the sheer level of security and privacy violations the school system seems to tolerate for this dubious purpose is ridiculous, and speaks to a deep gap in knowledge about, or devaluation of, these things that causes far more problems in our information-driven world than cheating on a dumb test possibly could.

I've gotten advanced degrees and consider them to have been overall a valuable experience, but even then most of the education system as we have it was pointless torture, and I probably would have quit school if this nonsense were around when I was in it


"... I am strongly convinced that the current society-wide policy of treating university education as expensive vocational training is a horrible and costly mistake whose main function for the overwhelming majority of students right now is to saddle people with debt."

This is not said often enough and it is a shame.

Universities are not intended to be vocational schools. They're research institutions where the value they generate is the knowledge they discover. The knowledge discovered is often frivolous and arcane, with no real use outside of a very specialized subsection of a field of study. However, there's also the knowledge which is 'useless' at the time of its discovery but changes the world later on as it's studied further and applications are discovered.

E.g. https://math.berkeley.edu/~gmelvin/math54f12/math110su12_gra...

I don't think there's much more to add than Turchin's ideas of "overproduction of the elites", and it rubs me the wrong way when I hear intelligent people gripe about how university credentials and research have marginal vocational utility. The problem is with a culture that celebrates credentialism and gets people to take on crippling levels of debt to avoid an impoverished future, not with scholarship.


> Universities are not intended to be vocational schools.

That famously changed under Ronald Reagan. Then came "everyone should go to college", while maybe 10%-20% of the population usefully should.


As a former educator, the only thing I can add on top of this is that training compliance, obedience, and mindless rule-following is a feature, not a bug of the education system.

And I feel like it's drilled into instructors too.


I strongly agree. I think this has been the case for all versions of the public education system in living memory, and the main ways this has changed over the last few decades are twofold:

-More oversight and efficiency in this process is breaking down the unofficial means through which anyone got anything else out of it (e.g. real mentorship from educators who care that is not encouraged so much as tolerated by education policy, opportunities to engage with new ideas in a meaningful way, good reading recommendations, an incidental avenue into a social life)

-The structure and function of public education is increasingly infecting university curricula, structure, and priorities

I think this likely has the effect in broader society of creating less fluid competency and more blank-faced compliance, and this disproportionately affects people in key leadership roles as schooling becomes more of a selection pressure on people's career paths


I'm interested in perspectives on this from outside the US or Anglosphere generally - I'm from Australia, and the education system is largely similar, if not worse, as it inherited British hierarchicalism (is that a word? it is now!)


Everything you said is spot on, and it pains me to type that.


> I get that there are a few professions where you're really going to need to know the right answer quickly without looking it up under high-pressure situations

I agree with the premise, but I think you're downplaying this quite a bit. Pretty much any job where you're not doing asynchronous work in front of a computer would apply here.

> If you elaborately cheat your way through all online classes all the way through undergrad, that seems mostly bad for you

Well it's more than bad for just you. It's bad for the university and everyone else with a degree from there since you are degrading the value of having that degree.


> Pretty much any job where you're not doing asynchronous work in front of a computer would apply here.

It's lame to ask for examples, but do you have any? Because honestly I don't think it applies to many jobs, unless it's something that a) puts you in high pressure situations, and b) requires specific in-depth knowledge. There aren't that many jobs like that, and for the ones that do exist (medical, military), you don't get in just by doing a bunch of multiple choice tests.


> for the ones that do exist (medical, military), you don’t get in just by doing a bunch of multiple choice tests

As someone in the medical field, I would say that for upwards of 80% of the graded portion of a nursing license and its certifications (trauma nursing, cardiac life support, etc.), you’re graded entirely on multiple choice questions. People die from mistakes, yet I meet a lot of people who proudly admit to having cheated or googled their way through much of their studies. Some of them are good at their jobs, and quite a few are downright dangerous.


> I would say that for upwards of 80% of the graded portion of a nursing license and its certifications (trauma nursing, cardiac life support, etc.), you’re graded entirely on multiple choice questions.

A lot of places won't let you take the exam without practical experience, under the supervision of clinical personnel. Harder to google your way through that.


I can’t speak to the certification exams, but for nursing school, there is high pressure from administration to pass students as long as they can pass the standardized test, even if the teachers don’t think the student is ready or safe for clinical work. So, although they do have practical experience, it is often limited in scope, and students can fake their way through it to a surprisingly large extent (they’re always paired with a licensed staff member). This is anecdotal, but consistent across what has been told to me by staff members of several nursing schools.


I’d argue that a lot of jobs have times that are stressful, high pressure, and require one to have the knowledge immediately.

A truck driver, a crane operator, a musician, a teacher, a chef.

Danger doesn’t have to be physical. Embarrassment, potential failure, etc. are all reasons to need to know something without looking it up.


None of these can be tested by an online-proctored test necessitating the installation of a rootkit.

The first two have specific government licensing requirements that require in-person examinations in use of the equipment in question.

The third has no need of any sort of proctored test.

The fourth _may_ need some sort of proctored test, but most teaching licences require in-person or otherwise monitored practica—and those are far more viable than anything proctored. Teachers should _often_ be ready to turn to books, depending on what it is they are teaching.

The fifth also doesn’t need a proctored test, and the type of immediate no-book knowledge required when preparing a dish isn’t readily memorizable; it is achievable only via long experience. Most chefs work from recipes and plans.

I don’t think that any of the examples you have given fit the mold.


I think you are underestimating the amount of regulation and qualification that is required for various roles. Health and safety, code compliance and best practice are tested and assessed for a vast number of professions where I am. Certainly teaching, driving, food safety, building etc. I struggle to think of a profession or trade that has no testing or legal requirement for standards (and associated proof of compliance). Many companies require online training to get their own policies across.

Probably a key detail - I’m in New Zealand, and health and safety is taken increasingly seriously. Company directors face stiff fines/imprisonment for H&S failures, and while things have a long way to go, they have also come a long way in a short time.


I’m in favour of (sane) licensing rules. I believe that a number of licences that are offered in many places are _too easy_ to get and need more training, ongoing training, and certification to maintain a licence.

However, health and safety &c are mostly _orthogonal_ to the professional standards of the professions you listed—and in some cases (musician, teacher, and chef), are not the primary responsibility of the professional. (A tour manager / stage manager may have H&S responsibilities around a musician’s show, but musicians themselves have almost no such responsibilities.)

Structural engineers need to know how to calculate the safety of what they are building, but they may (and probably _do_) need to look up the specific regulations and requirements for different localities or structures or materials quite regularly, unless they work on a single type of structure and material in a single location that never changes its regulatory requirements.

Looking up the specifics to an answer isn’t a failure, and our training and certification regimes need to recognize that it isn’t a failure. If having split-second responses to a particular situation is required, then the training needs to focus on what is absolutely required and reflect that (much like military training does with respect to weapon safety).


When I visit a restaurant, I sometimes ask the waiter whether he knows if a certain dish contains dairy. There are usually three answers:

- Bad waiters will just have no clue and won't know what to do.
- Mediocre waiters will offer to go to the kitchen and ask.
- Great waiters will tell me yes or no, offer to have the dish cooked with X instead of Y, or point me to other options on the menu (which they already know by heart) that do not contain dairy.


A good waiter certainly has great "menu knowledge"; however, this is actually a situation where the waiter shouldn't be trying to impress the customer with how well they remember ingredients. A customer informing a waiter of a food allergy should trigger a clear protocol (one the restaurant may have established in writing), which will involve talking to the kitchen/chef/manager.

Many places will have waiters follow up the "does X have {{ allergen }} in it?" with clarifying questions as to whether the customer has an allergy or if it's a preference. If they are indeed asking because they have an allergy, it's critical that everyone dealing with that customer's food knows in advance so they can double check all ingredients and prep/handle the food in a way that limits any potential cross contamination.


I'm glad I paid $40k to learn to be a great waiter.


I guess I should add "high stakes" to my list of requirements.


My suspicion is that for almost everything, constantly having to google will be a problem. Maybe a harder, timed test is the right way to filter that out and find the people with enough knowledge, but it's harder to get that balance right.


And then you reach a point where you don't even know what you need to Google to solve a problem. I can't believe some people are under the impression that being successful at your job is only a matter of translating requests through the Google machine and spitting out answers. It tells me that they have not faced complex problems before.


I think the position being expressed here is more that, in the real world, Google is now part of many workflows, so a test that asks you to be cut off from it is actually unrealistic. I don't fully agree, though, because a test is also unrealistic in that it often gives you more than enough time to solve the problem. And, maybe more along the lines of what you're saying: because it's a test, it can't really include truly novel problems that don't have known solutions.


I completely disagree. Knowledge recollection of school-taught concepts is irrelevant to practically every job. Yes, most jobs require you to remember various work-related things, but those are not taught in school; they are taught and reinforced on the job itself.


What this tells me is that you may not have gotten the most out of your education. Sure, there are specific tasks that you learn how to complete for particular jobs, but the way you structure your approach to problems should have been helped quite a bit by what you learned in school.

Personally I found studying philosophy extremely helpful to solving business problems. Not because there is an exact 1 for 1 concept match. Rather it teaches you how to frame and break down problems so that you can be more adaptable and efficient when faced with various unknowns.

There's always going to be domain-specific learning needed for any position. But to dismiss any tangential knowledge as useless is extremely short-sighted.


You said “studying philosophy” rather than “being tested on rote memorization of philosophy text books”.

I agree that “studying philosophy” is a worthwhile endeavor, but I align more closely with the parent comment in regards to memorizing the semi-random litany of data bits that show up in a typical exam. The information is ultimately forgotten and is usually only relevant in the context of the current textbook chapter. Nearly every useful concept that applies to my two separate careers was learned on the job. The time spent in the classroom is nearly irrelevant and could have been replaced entirely by several weeks on the job.


I just gave a personal example to flesh out my post. But if, for example, you are a structural engineer who has to consult on site with clients, you certainly need to recall a litany of data/concepts (random or non-random) pretty quickly if you don't want to look like an idiot. You cannot learn this knowledge from a few weeks on the job because the client would be able to see through your bullshit in minutes.

This applies to pretty much any position that needs to communicate in real time about a base of knowledge (doctors, lawyers, logistics, any type of management position). If you believe that any career can be substituted by "a few weeks on the job", you're not really aiming that high for yourself.


I'm curious about the proportion of careers which are not adaptable to asynchronous/remote work that _are_ best tested through rote memorization (think multiple choice questions) versus active demonstration (think having an instructor watch you do CPR on a dummy).

Just a thought about how it's weird that we argue about how testing should be done online, but not about the fact that we test online things which are not really online in nature.


Pretty much anything that requires real-time communication involving a deep base of knowledge. I'm not going to list out hundreds of possible roles that this encompasses - but I'm sure you have the imagination to think of a few.


[flagged]


He seems to have had a very successful career doing something that he enjoys. I don't really understand your point.


I think Conan would like this joke. Sounds like one Andy would make at his expense back in the day.


It's somewhat amusing to see someone write a lengthy rant against testing because memorizing information is archaic, when the ability to write, at least at the start, is learned this way.

The most obvious example why you're wrong is learning a second language. Even learning a language outside of a classroom generally requires memorization of fundamental vocabulary (and sometimes a writing system); it therefore makes sense that a class teaching a second language would test students on this memorization.

Now, other fields might not be so obvious, but they require many things to be quickly accessible, much in the same way that one might need quick access to a word or a character with a second language. A biologist needs to know and understand the central dogma, an engineer needs to understand mechanical stress, and a chemist needs to be able to read a structural formula; they need to be able to do these things essentially instantly (and understand them intuitively) to be able to even discuss more complex topics, because complex information builds on simple information. Hence it makes sense to make sure those pieces of information are understood and readily available before sending a student off to a more advanced topic.


> It's somewhat amusing to see someone write a lengthy rant against testing because memorizing information is archaic, when the ability to write, at least at the start, is learned this way.

Writing is a skill. It is NOT memorization. It is not taught or learned that way, even at the beginning, unless you are talking about handwriting.

No one says, "Memorize this phrase, and now write it out again."


I mean you are just wrong. You memorize what sound (well, sounds) the letter "a" makes, which is of course fundamental to being able to write because English uses an alphabet. You might argue that reading isn't writing, but the two are interdependent (or at least writing is generally dependent on reading).

Not to mention memorizing how to physically write (i.e. what you're talking about with handwriting)...

I assume you think I mean composition or something, which is of course extremely obviously not what I meant... But to develop that skill of "writing" that you're talking about requires memorization of the fundamentals... Which applies quite broadly to other disciplines...


It's not memorization, it's pattern recognition that really counts. What sound "a" mkes is irrelevnt if you're recognizing the relevnt pttern. So it's you who's in the wrong.


You cannot reasonably intuit the sounds that letters make, or the meaning of combinations of letters without some underlying phonetic information. Your ability to recognize those patterns is built upon learning the meaning of words and letters as I described (not to mention whether what you're doing is pattern recognition or pattern memorization).

You are just wrong in every possible way.


There is a qualitative difference between simple memorization and generalized pattern recognition. You fail to distinguish these two very distinct things: one of them admittedly is the basis of the other, but the other is very much required for higher cognitive function.

Memorization can be done on a cellular level, generalized pattern recognition - not so much.


Lol. Ignoring the general absurdity here, none of this has to do with the basis of reading/writing, which in English is a set of phonetic letters. I've described why these must be memorized.


Welp. We are in a dead-end.


Lol.


Not everything IS memorization, but lots of things HAVE memorization. You don't write by regurgitating exact phrases, but your ability to write will be significantly impeded if you need to constantly look up word definitions, spellings, word order, syntax rules, and orthographies. There are certain fundamentals that need to be automatic, and it's fair to require internalization as part of acquiring mastery.


> No one says, "Memorize this phrase, and now write it out again."

No, they say "Write this word 10 times to memorize it" and then give spelling tests.


I largely agree with this. There are some careers -- firefighting is one of them -- where it is downright dangerous for the person to have a reliance on technology (technology can help firefighters, but the battalion chief still needs to know the entry/exit points on a structure to direct crews, and the drivers need to *know* the city they are in, backwards and forwards, to get to the site quickly -- they cannot rely on Google Maps), but I'm not sure how many of those careers (air traffic control would be another) would have remote-proctored exams anyway.

Having said that, I do think that many of us have become too conditioned to having access to Google/the internet/the ability to always look stuff up, that it does become a problem (I am utterly incapable of getting around the city I live in without Google Maps, as an example), and that, I think, is a negative for humanity. Having access to information and reference is great, but never taking the time to actually learn/memorize/understand concepts and content so that you feel confident doing something without checking the answer is still really important in lots of roles. There should be a balance.

But none of this excuses the malware/spyware that ProctorU and other systems employ. Frankly, people who are that committed to cheating will cheat anyway.

I've heard of similarly dystopic sorts of bullshit even when it comes to remote job interviews, often (but not limited to) code tests. I cannot imagine any job that would be worth me installing spyware/malware on my personal machine for the interview. Even for a university exam, I would push back hard on this sort of thing (or at least the onerous requirements and access it has) or demand the school issue testing laptops or at the very least re-think how/why they proctor exams the way they do. Individually, students don't have a lot of leverage, but if enough people complain -- especially in high dollar degree programs like a MSc -- universities will rethink their behavior -- or at the very least, force the software firms they pay millions of dollars to to support a fucking Chromebook.


Firefighting is actually a really awful example. The practices involved require in-person practica, and there are stages of growth expected.

Additionally, for any structure of sufficient complexity, the "lead" firefighter will _absolutely_ be requesting and receiving building plans to direct the firefighters to the appropriate locations. Also, like London taxi drivers, fire truck drivers probably have to train on the streets they are protecting. (That said, I do recall stories of firefighters getting lost due to bad GPS directions.)


That’s exactly what I said. I said firefighting is an area where you cannot rely on just googling something. You need to know the information, period. I wasn’t talking about in-person or online tests, I was responding to OP who said that that style of testing where you can’t have access to materials is incongruous with how most things work.

Quoting myself:

> but I'm not sure how many of those careers (air traffic control would be another) would have remote-proctored exams anyway.


I am saying the exact opposite. The lead firefighter will use her profession’s equivalent of "google" to get the plans for a building that may be more complex (talking to the city planning department, looking in a city database, whatever). They will _not_ know the information by heart, and in a sufficiently large city…cannot.

The same applies to the drivers of fire trucks. They’re going to both be trained to drive around for locations and tested on that, but they will also use appropriate tools to get the location. It might be an "enhanced" GPS, but it isn’t going to come down to someone’s memory all the time.


I don't think that is an effective life strategy, because you actually solve problems more slowly if you have less knowledge. If you've learned skills and committed things to memory, your brain can synthesize information much more rapidly. If you have to google everything, it's like running your 2.5 GHz PC at 10 MHz, and it is highly unlikely your brain will produce new information on the spot.

Edited for snark removal.


Following your lead:

Thinking is not "answering questions".

In order to build a mental model of a problem, you need a rather good model of the environment in your mind, and this comes from memory and the habit of thinking (which is what drill exercises facilitate).

Rote learning is an essential tool in being able to create adequate models of the environment.

The problem is lots of people think it is useless because "I did not need to memorize", when the issue is that "they are sufficiently able to memorize that they do it unconsciously". But most people need to put in the effort.


For me personally, I just can't memorize stuff on demand like exams expect people to. My brain refuses to do that, period. But when I do use some information often and many times, I somehow end up memorizing it eventually. APIs, phone numbers, addresses and routes, even people's faces.


> I just can't memorize stuff on demand like exams expect people to.

Unless things have changed in the 30+ years since I was in college, I seem to recall exams happen after weeks of study and not "on demand". Part of learning is memorization. There's no way around that. It sounds more like you aren't cut out for the pace of high-caliber institutions which move very quickly. I have two friends with literature degrees, one from UCONN and one from Dartmouth. The Dartmouth friend had to read 4x the number of books as the UCONN friend in the same period of time for the same degree. C-students still need jobs! Just because you aren't a straight-A student doesn't mean you can't still be a serviceable employee.


Just for context: I'm Russian; it all might be different in your part of the world. Anything education-related was hard for me exactly because of the pervasive "you can only use your head" mindset. You have to use formulas from memory. You have to do everything from memory, except maybe the teacher would write some constants on the blackboard, if you're lucky. In some classes in school (physics iirc?) we were allowed to use a calculator, but for math you definitely had to make do with only pen and paper. I easily grasp how stuff works, but this whole "memorize the exact intricacies or risk failing the test/exam" thing is what gets me every dang time.


If I'm recalling correctly, we changed our maths program to include lots of manual (calculus?) maths when the US gov wanted to churn out engineers to compete with Russia in space.

The need to manually solve the formulas has since been superseded by basic calculators. But it's still all (personal opinion) unnecessary rote equation solving these days. I certainly could've used more emphasis on structure and less on longhand math.

Good to know it's the same over there!


I don’t want my emergency room doctor looking up a fairly common disease on the internet. And I ding interview candidates who have to look up basic syntax like a for loop (I have no problem if they need to reference random APIs or advanced syntax though).

Being fluent in your professional vernacular is very important for productivity. And a closed test is the best, scalable way we have to test this.


You won't succeed on any advanced test, with or without reference materials, if you haven't mastered the underlying basics. There would be too much to look up. You can't simplify fractions in a reasonable time if you need to look up everything in a multiplication table, for example, and it's true for everything that builds on that. It's the same for any topic: there comes a point where you can't succeed on Google alone.


You can look up mathematical induction, but if the exam is the first time you try to solve one, you won't succeed.


I've been coding for 30 years and I still have to look up syntax for case/switch statements. You ding me?


Seriously, that's just not super valuable information to have stored in your mind like that.

Sure I could, it'd be fine, but why store the entire syntax structure in my mind rather than a pointer to the location where I can find it in its most updated form?


I'm sorry, but what?


They have been coding for quite a while and are saying that they still look up the syntax for case/switch statements, because they cannot remember it off the top of their head. I am in the exact same boat, except with about a third of their experience coding.


I would definitely ding someone and think less of them if they claimed to be a senior engineer but couldn't remember how to write a switch statement in Java/C++. If you have to look things up, you work slower, and you don't understand the nuances of, e.g., what types can be used in switch statements in which language, which limits your design skills.


Do cases fall through? Do I need to have them sequential? Can I put additional conditions on it? What if I have a case multiple times?

There is a surprising amount of complexity behind switch/case.
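For example, here's a quick sketch of the answers in C specifically (Java and JavaScript mostly match C on these; Go doesn't fall through unless you write fallthrough; C# forbids falling into a non-empty case):

  #include <stdio.h>

  /* In C: cases fall through unless you break, labels need not be
     sequential, a duplicate case label is a compile error, and extra
     conditions need an if inside the case. */
  static void describe(int n) {
      switch (n) {
      case 7:               /* non-sequential labels are fine */
      case 3:               /* empty case: execution continues below */
          printf("lucky ");
          /* no break: deliberately falls through again */
      case 0:
          printf("small\n");
          break;
      default:
          printf("other\n");
      }
  }

  int main(void) {
      describe(3);   /* prints "lucky small" */
      describe(0);   /* prints "small" */
      describe(42);  /* prints "other" */
      return 0;
  }

Whether any of that carries over to the language in front of you is exactly the kind of thing worth looking up.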


Shoot, some syntaxes haven't even been around long enough to have a for-loop documentation page hit the front page of a google search for "{language} for loop". As an agency developer I've got like 6+ languages/markup syntax knowledge required of me in any given year; they just keep making new ones and I have to keep learning them.

My current requirements, just this month, are:

  * Python
  * PHP
  * Javascript
  * React and Vue
  * TypeScript (but not in every JS project!)
  * Blade templating
  * Liquid templating
  * Whatever Algolia uses for its inline templating
  * Bash
  * C# (games!)
And if you want you could throw in arbitrary configuration syntaxes for all the infrastructure-as-code insanity that comes with it all; some of those have loops.


Sure, but why is someone asking you to use Algolia templates in an interview?


It was just to illustrate how much arbitrary knowledge is required in modern stacks. We could be talking about totally different industries though, perhaps you're hiring senior C developers and it'd be weird if they didn't know the syntax yet.

For apps and web development, I wouldn't fuss too much about specific language syntaxes since if I were to cull my applicants based on language experience I'd be throwing away plenty of talented developers. I'd hire a talented Python developer for a job writing JS for example, if they're applying then they're willing to learn it.


Fair enough. In my interviews I tell candidates to code in the language they’re most familiar with. Hopefully someone has memorized a for loop in their go to language.


Aha, that's a pretty fair approach then.


I assume the doctor is having all your symptoms recorded; are you against those being automatically put into an expert system that lists possible diseases? One that knows the failure rates of tests, and so can give a more accurate chance that a patient who tests positive for X and Y but negative for Z actually has X, Y, and Z, because the Z test is only 90% accurate?
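To make the arithmetic concrete, here's a toy sketch of the kind of thing such a system computes (every number below is invented for illustration):

  #include <stdio.h>

  int main(void) {
      /* Invented numbers: a "90% accurate" Z test, and a 50% prior
         that the patient has Z based on symptoms alone. */
      double sens  = 0.90;  /* P(test positive | has Z) */
      double spec  = 0.90;  /* P(test negative | no Z)  */
      double prior = 0.50;  /* P(has Z) before testing  */

      /* Bayes' rule for the case above: the Z test came back negative,
         but what is the chance the patient has Z anyway? */
      double p_neg   = (1.0 - sens) * prior + spec * (1.0 - prior);
      double p_has_z = (1.0 - sens) * prior / p_neg;
      printf("P(has Z | negative Z test) = %.2f\n", p_has_z);  /* 0.10 */
      return 0;
  }

A 10% residual chance despite a negative test is exactly the kind of number a human eyeballing "the test is 90% accurate" won't produce.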

By scalable you mean cheap.

An actual test would be: design a whole system/experiment to do a thing, in 3 hours.

Can't look up basic things, you'll be too slow.

But then it can't be auto marked, so isn't cheap.

Exams: The very last time in your life you won't have internet access.


Doctors google rare disease symptoms sometimes, however, and it's seen as a legitimate thing to try if you're truly stumped by a mysterious illness.


And in the days before the internet, they likely had a bookshelf full of reference manuals.

True story: when I became a programmer in the 1990s, I used to buy the printed manuals for the Java APIs as they were released. At one of my first interviews (for a Perl job), I was given the Camel Book as reference for the (handwritten) programming test.


And just as often they don’t. I went to the ER for a seizure, and basically after paying $5,000 the advice I got was “well, that drug has seizures as a side effect, so stop taking it and find a new one.” Absolutely no investigation into why this happened, whether there’s a class of drugs I should avoid, nothing.


The business world has a jargon term, "solutions provider". The idea is that they strive to not just provide you a tool, but to fully solve your problem, to make it entirely go away. It's often just a buzzword, but like most buzzwords, there's a kernel of useful truth in the middle of it.

The medical industry is not a solutions provider. You should view them as a useful tool, but one that still leaves you with the responsibility to utilize the useful tool to solve your problems.

I am not making a normative claim here; I'm making a descriptive one. The medical system is an incredible toolset, but you need to be ready to assemble it into a solution. Maybe it should be a solutions provider. Maybe it's really discriminatory against the people who won't or can't operate this way. No argument. But it observably isn't a solutions provider today, whatever it "should" be.

In this particular case, if you care you should have scheduled a followup with a different doctor. ER doctors don't do that sort of analysis.


Huh, sounds like you might have been better off if "[your] emergency room doctor [looked it up] on the internet".


Well I can’t find any information on the general internet about this, beyond what the doctor said. One would most likely have to synthesize several different areas of knowledge to come up with a plausible hypothesis.


Then you can test on time. If they are expected to remember the term without the internet, give them a few seconds after showing the question. Remembering the answer in 30 seconds is no better than being able to google it.


> I don’t want my emergency room doctor looking up a fairly common disease on the internet.

That ship sailed a long time ago.


I took the approach of just letting my students have unfettered access to the internet during the pandemic, even for exams. You may think this would lead to everyone getting an A, but it turns out the grades were normally distributed with a B- average, which was pretty typical for my courses pre-pandemic.

What I did was ask them to write real code. It’s a course on programming. If you can get the computer to accomplish the given task in whatever way you know, including gluing together parts and libraries from online, then that’s a demonstration that you’ve learned something!

And as it turns out, C and D quality work is still submitted even when students have all the time and resources in the world. The internet is not a magic wand to solve all problems for you. For example, my final project is to write a file server. Yeah, you can find tutorials for how to do this on the internet, but it’s still going to have to be customized to use my specific protocol. Someone who knows what they are doing can do this assignment in 10 minutes leveraging the right tools. Others might write it from scratch as we discussed in class. Others still will barely get past writing a makefile, even with step-by-step instructions and examples of how to do it. So in the end the grades worked out as they usually do, distributed normally with a B- average, and about half a letter grade stdev. That’s just the way it is.
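For a sense of scale, here's roughly what the 10-minute glue version looks like under a hypothetical stand-in protocol (the client sends a filename on one line and the server streams the bytes back; the assignment's actual protocol is different, and error handling is omitted):

  #include <arpa/inet.h>
  #include <netinet/in.h>
  #include <stdio.h>
  #include <string.h>
  #include <sys/socket.h>
  #include <unistd.h>

  int main(void) {
      int srv = socket(AF_INET, SOCK_STREAM, 0);
      struct sockaddr_in addr = {0};
      addr.sin_family = AF_INET;
      addr.sin_addr.s_addr = htonl(INADDR_ANY);
      addr.sin_port = htons(9000);   /* arbitrary port for the sketch */
      bind(srv, (struct sockaddr *)&addr, sizeof addr);
      listen(srv, 8);
      for (;;) {
          int c = accept(srv, NULL, NULL);
          char name[256] = {0};
          /* Toy protocol: the request is a single line naming a file. */
          if (read(c, name, sizeof name - 1) > 0) {
              name[strcspn(name, "\r\n")] = '\0';
              FILE *f = fopen(name, "rb");
              if (f) {
                  char buf[4096];
                  size_t n;
                  /* Stream the file back to the client. */
                  while ((n = fread(buf, 1, sizeof buf, f)) > 0)
                      if (write(c, buf, n) < 0)
                          break;
                  fclose(f);
              }
          }
          close(c);
      }
  }

The point isn't the specific code; it's that a student who understands sockets and files writes something like this quickly, and a student who doesn't won't be saved by a tutorial.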

So this fear that access to the internet will lead to rampant cheating is only relevant if the questions are easily gamed: definitions, facts, and contrived math problems. Well, it turns out this describes like 90% of the prepared materials for low-level intro classes.

I mean, if you can get through a number of semesters of college by looking up answers on Google, the problem is not with Google, it’s with the curriculum. Because what value are these courses really adding if they are just cramming facts into your brain and then you regurgitate them, and promptly forget?

Take chemistry for example. All of chemistry should be in a laboratory setting. All of it. It’s a perfect hands on discipline, yet the standard sequence is just year after year of memorizing facts and definitions with very little comparable lab time.

And I get why this isn’t done, it’s probably not practical and there does need to be an acquisition of fundamental knowledge, but that isn’t really done by going through prepackaged web content (that can’t be resold) with all the answers available on Chegg, as is the case for a great many 100 and 200 level courses in all major disciplines at most universities.


>Take chemistry for example. All of chemistry should be in a laboratory setting. All of it. It’s a perfect hands on discipline, yet the standard sequence is just year after year of memorizing facts and definitions with very little comparable lab time.

I mean this is just... incorrect. High school chemistry (generally the first time it's called chemistry and not science) is often taught in a lab (I assume mainly a function of whether lab facilities are available). Undergraduate general chemistry and organic chemistry generally have a lab component that is required to be taken concurrently (this is even true at community colleges). Using general chemistry as an example, the lab portion is usually one three-hour (at least) class per week, while lecture is three one-hour classes per week (organic chemistry is essentially the same). More advanced chemistry might not have lab components (e.g. physical chemistry), but the foundations are almost always at least combined with a lab.


I mean all of it, even all the way down to the math requirements. I envision a chemistry degree with zero sage-on-the-stage, or maybe in this case, sage-in-the-page style content delivery.


Ok... I was responding more specifically to...

>the standard sequence is just year after year of memorizing facts and definitions with very little comparable lab time

And I compared the lab and lecture time... and they are very similar (essentially equal)...

I didn't choose to respond to your overall ideas about a chemistry degree because that would require convincing you that there is quite a bit of fundamental information needed to make those labs useful (and efficient) and there was already a great misunderstanding regarding the amount of time chemistry students spend in the lab.


> that would require convincing you that there is quite a bit of fundamental information needed to make those labs useful

Well you don't need to convince me of that since I admit as much in my original post!

> and there was already a great misunderstanding regarding the amount of time chemistry students spend in the lab.

I can only speak to my experience advising dual CS and Chemistry majors. The ratio of lab to class hours for them is 1:2, and worse when you add electives. So that's the basis for my statement. YMMV.


>Well you don't need to convince me of that since I admit as much in my original post!

Ok, I guess in your degree program the only difference is that you sit around in a lab instead of a lecture hall learning those fundamental things then? Because I'm not referring to some trivial amount of fundamental information... I'm referring to, roughly, the amount of information covered in 3 one hour lectures a week.

> I can only speak to my experience advising dual CS and Chemistry majors. The ratio of lab to class hours for them is 1:2, and worse when you add electives. So that's the basis for my statement. YMMV.

Ok, I'm citing directly from my university's course catalog (and also the courses I took at several universities) so YMMV.


> If you elaborately cheat your way through all online classes all the way through undergrad, that seems mostly bad for you.

As a teacher, I would love to be able to dispense with tests except as informative diagnostic tools. Unfortunately, I am required to give at least a final exam by my university, and that and other grade-based requirements come from the confounding of what the university's mission should be, which is education of the interested, with what most (at least US) universities' missions have become, which is credentialling (of the often unwilling—and who can blame them, when the requirement is artificially foisted upon them?).

Unfortunately, there doesn't seem to be any way to get away from that drift towards credentialling that doesn't involve a shrinking student body, and administrators are addicted to the idea that we so badly need an ever-increasing student body to prop up … well, salaries for administrators, for example, whom we clearly need because without them we might not do so well on this endless treadmill of pointless dedication to growth.


Life is an open-answer problem, not a sequence of questions, so the "answer" is neither unique nor necessarily easily expressible as a sequence of statements.

Memory is useful because it is the very basis of reasoning. Rote learning is important because intellectual habits facilitate thinking.

Google does not have all the answers. More importantly: it does not have any questions.


I remember the tests in university where everything was allowed (books, notes, and sometimes even internet). They were feared because it meant the topic would probably be a difficult edge problem. It was only the good and engaged profs that did these tests though; I guess it is a lot of work to come up with new questions.

"Everything allowed" Math I + II ate the dreams of students like candy.

Still, even if you are good at math and have all the tools at hand, you still had to develop a general understanding of the topic since otherwise you would just be way too slow to work through the assignments.


In my mind (I know nothing about the universities' threat models) the biggest concern is not looking up some reference online, but paying someone for a couple of hours to take the test for you.

I agree: if you can look at available resources and competently perform the task, that is likely not an issue in the field.

However, paying someone to tell you all the answers is cost-effective for an exam that takes a couple of hours, but it will not get you by in the field. The university doesn't want to be known for giving out degrees to incompetent workers, so they need to prevent this case.


It's mostly bad because it'd be you cheating yourself I guess.


Absolutely. I work at a well known tech company, and when we give potential candidates coding tests we give them the option to code in any language they'd like (including pseudo-code), and to use Google if they can't remember the name or syntax of something they want to use.

There's no reason to fail a candidate just because they can't remember the signature of a method in a library they haven't used in a while.


> and it seems like testing and certifying doctors and lawyers is important enough to society

Lawyers spend most of their (billable) time researching. And the great majority of lawyers never see a court room. No quick reaction time needed.

Doctors... The average doctor follows a script to triage the easy 80% of problems. And the average doctor stops at that.


Doctors and lawyers frequently consult their references.


Proctor-spyware is also famously biased: https://library.auraria.edu/news/2021/why-online-test-procto...

I opted out of using spyware in the university physics courses I taught last year, and caught my cheating students the old-fashioned way. Proctor-spyware, like airport security, is more about theater than effectiveness. You aren't giving the USMLE or a Bar Exam, so you can take the time to write a good exam and evaluate it correctly.


> “It’s become clear to me that algorithmic proctoring is a modern surveillance technology that reinforces white supremacy, sexism, ableism, and transphobia. The use of these tools is an invasion of students’ privacy and, often, a civil rights violation."

Must all of the cards in the deck be played at each turn? Things cannot simply be bad/user hostile/privacy invading, etc.


It's important to note that none of these things are necessarily done deliberately (though "white supremacy" is perhaps a bad way to express "racism that benefits white people specifically"). Other than transphobia, either the linked article or the letter linked in that article provides evidence for all of the accusations. Facial recognition software that doesn't handle dark skin well intrinsically treats different ethnicities differently, in this case disadvantaging non-white people. Many of the markers for "suspicious behaviour" that are used to detect cheating are also present in people with both mental and physical health conditions. Dealing poorly with headwear has an outsized impact on non-white women (who are likeliest to wear headwear that obstructs the face).

Again, I wouldn't chalk any of this up to deliberate bias against any of these groups, but it's all bias nonetheless.


This is speculation, but many transgender folks tend to present gender in a non-traditional way, as do people in the midst of transitioning who may be "in between" presentations in a traditional sense. If you only train on cisgender faces, you may only be training on sort of "default" gender presentations, facial structure, and the like, which may produce a similar disadvantage.


I can totally 100% see how transgender people might trip up some of these things. For the point of this discussion, though, all the other biases had a specific example in the linked article, whereas that one didn't.


[flagged]


White supremacy is the idea that white people are inherently better and more capable and therefore more deserving. Hence, for something to be white supremacy, it doesn't just need to have a bias favoring white people. It should also have a justification of this favoring based on white people deserving better or being better.

Racist systems that have encoded society's biases are generally not white supremacist. The 'justification' for those systems is often things like "this is just the way it is" or "this was easier to do like this" or "I went with my own experience".

These days a lot of racism does not come from white supremacy. It either comes from something like familiarity bias of people in power, or from following the status quo mindlessly. Calling those acts white supremacist can be dangerous. It allows the real white supremacists to hide among the unknowing. It also pushes people who unintentionally did something racist way into the defensive if you tell them they are white supremacist. And pushing people who unintentionally did racist things into defending their actions is not going to make things better.


> White supremacy is the idea that white people are inherently better and more capable and therefore more deserving.

That's a common dictionary or encyclopedia definition of white supremacy. More broadly, white supremacy also refers to the systems and structures of power that are built into most "western" (a better term might be post-colonial) societies, and that favor both white people and people who support or uphold the balance of power in those post-colonial societies.

> Racist systems that have encoded society's biases are generally not white supremacist

I agree, however I think that those racist systems that are not inherently white supremacist in nature are largely rooted in non-colonial countries (basically countries other than the European colonial powers, and the countries that grew out of those colonies).

> It also pushes people who unintentionally did something racist way into the defensive if you tell them they are white supremacist. And pushing people who unintentionally did racist things into defending their actions is not going to make things better.

That is just not true. If someone does or says something racist, they can and should be challenged on it. If they become defensive, there are multiple reasons that could happen, but if the reason is that they simply didn't know better, it's just the way their society is, or if the reason is that they are opposed to "wokeness" (which is a catch-all for intersectionality, critical race theory, and many other modern perspectives and ideologies that are largely centered on dismantling power structures and reducing bias and discrimination), then it's likely that they are supporting white supremacy out of ignorance (whether that ignorance is from being uninformed or uneducated, or the more malicious willful ignorance of people who choose to use or engage in racist norms because they are opposed to "wokeness" from an ideological or other perspective).

Pushing back on people who do things that could be cast as unintentionally racist is the only way to a) educate them, so they can do better, or b) determine if it was an intentional act. I know this from practical experience: it was only from going through the hard and painful experience of being called out on harmful "unintentionally" racist jokes and behaviour that I learned to do better, after being raised by a family that had (and for the most part, still has) some pretty racist and discriminatory views.


Let's not devolve into a semantic discussion about white supremacy. Under your definition, I agree with a lot of what you are saying.

I still think we should not tell people who do racist things "you are being a white supremacist". Instead we should say "that thing you did was a bit racist" or even "that thing you did could be seen as racist". We should _definitely_ challenge those people. But if you are trying to get someone to change their mind or behavior, you gotta be real careful about their feelings.

Calling someone a white supremacist is going to hurt their feelings quite a bit. And that isn't going to make them more likely to consider if they should change or did something wrong. That doesn't mean don't challenge people on their actions. That means challenge people on their actions very gently.

If we don't treat these people gently, then we lose the less introspective part of the population. They start getting reactionary, start 'banding together' against these annoying people. They start 'fighting back in the culture war'. Basically, getting this wrong is how you create republicans.

All of the above means I think we should be considerate in how we challenge racists. And I think part of that involves being really careful about the term white-supremacy.


> the systems and structures of power

what a vague concept lumping together unrelated things

the systems of power in one place are different from those in another place; they are unrelated and can't be reified like this


There's an old saying: "It is difficult to get a man to understand something when his salary depends upon his not understanding it" (often attributed to Upton Sinclair, not sure if that's true or not tho).

I'd like to suggest that what you are calling "familiarity bias" might have a component of the quote in it too. Not salary in this case, but social position. That is, in a racist system, one race of folks gets better treatment, and if they want to maintain better treatment, the status quo must be maintained. The group of people at the top of a racial hierarchy (that is, in the supreme position) are incentivized to keep the racist system. When race is considered a bad reason to judge a person, they are still incentivized to maintain the system; they just find different words to justify the status quo.

I guess a different way of saying this is - white supremacy describes a race based social hierarchy where white people are at the highest level. It has also been used to describe the lowlife Nazi or KKK wannabes that advocate for it in the baldest terms, but they are bigots who advocate for white supremacy using racist terms like "inferior genetics" or worse.

Compare the term racist itself - there are folks who would have you believe that the term is limited to personal bigotry against people of a different race, and has nothing to do with the rules and actions of systems (a position I think you don't hold due to your description of racist systems).


I'm not handwaving anything away. It's just that I don't think all racism is the same.

White supremacy, to me, implies explicit, militant, proactive racism. People who might very well be proud of the fact they're racists.

This story is about a software system that (among many other issues) doesn't work well with darker skin tones in low light. Especially in light of all the other failure modes, I'd ascribe that to carelessness or indifference, mixed with pressure to reduce false negatives at the expense of more false positives. I wouldn't be surprised if training data and/or testing were filmed in an office setting with the amount of light you expect there, and they never ran across the issues with dark skin interacting poorly with the amount of lighting a student would have at home.


> White supremacy, to me, implies explicit, militant, proactive racism. People who might very well be proud of the fact they're racists.

That is only half of the story of white supremacy though. The other half is the entrenched systems and biases baked into those systems that largely benefit white people, and that train people, through experience, to prioritize preserving the existing systems and status quo.

Not considering the fact that there is a well-documented history, over the last 20 years, of tech companies and business in general prioritizing the experiences of the white majority at the expense of people of colour is largely the reason why you "wouldn't be surprised if training data and/or testing were filmed in an office setting with the amount of light you expect there, and they never ran across the issues with dark skin interacting poorly with the amount of lighting a student would have at home", and why you don't consider that norm, or its acceptance, to be indicative of white supremacy.

Those biases may not always, or only, impact people of colour, but they do overwhelmingly benefit white people. That's the entire point of the article that OP shared, and of the references the author of that post uses to back their claims.


It seems clear that a dev team could whip up a product that tested well (using folks in office, family members, friends, etc.); was trained with datasets that - for whatever reasons - weren't sufficiently varied; and hit some mark of success and pushed it out the door to refine the rest later. It also seems clear that the resulting product could do poorly when recognizing black skin, due not to ill intent but lack of polish with the resources on hand.

But something I always wonder when accusations like "white supremacy" are thrown around: is it falsifiable? What evidence would dissuade you from that?

- What if both ends of the spectrum do poorly and extremely pale people have problems, too?

- What if the threshold is dark black and lighter-skinned black people, Asians, Middle Easterners and other non-white people are able to use it successfully?

- What if only a narrow band of light levels work, making it clear their testing range was generally too narrow, not just in skin color?

- What if they took care to incorporate black models in testing, but the photo quality (and their own in-house cameras and lighting) overestimated the quality of most home users'?

And what of the myriad other things that were done poorly in the software: limited OS support, bugs, excessive memory usage, overall intrusiveness, browser limitations, disallowed mobile devices, lack of multi-monitor support? Do they likewise arise from systematic oppression of some group? What if we dig in and find that white people are more likely to use iPads, Linux, and multiple displays?

Most often these accusations flow in only one direction, while all other flaws or problems are taken to be simply happenstance and noise. Certainly anything that impacts white people negatively will not be automatically seen as anti-white, although in a world with activist devs, such a result isn't incomprehensible.

Claims of white supremacy (among other accusations of character) are thus, to my mind, wildly speculative and carry a very heavy burden of proof.


I think "white supremacy" tends to imply direct, explicit bias, and may sort of exclude the built-in "unrealized" biases that exist in the current culture, where white supremacy was the foundation but not necessarily explicitly imbued.


> how is it even possible to handwave away systemic racism and bias "that benefits white people specifically" as anything other that white supremacy?

Your way of grouping people by race is kind of arbitrary. It puts together rich and poor when they only share a skin color. How is white supremacy working in Bulgaria, for example?


[deleted]


Yes, I totally get that, unfortunately I accidentally left out part of the first question in an edit :(


Ah gotcha, deleted the comment.


Don't you normally expect people to make the strongest case for preventing something? It's pretty common for a lawsuit to bring every claim a lawyer can come up with in the hopes that enough will stick to get the outcome they want.

I would especially consider that in the United States we do not have a broad legal right to privacy but there are potentially much stronger tools available if that software skews negative outcomes towards protected categories like sex, disability, or race. From the perspective of a student, job applicant, etc. being asked to use this, if the legal risks cause an organization to stop using it they'll enjoy that as a win even if the outcome isn't a blanket ban.


>Don't you normally expect people to make the strongest case for preventing something?

Not at the expense of grounded reasoning. When I see poorly substantiated claims, it shouldn't drag the whole rest of the argument down, but it does. The argument presented about lawsuits is actually a great example of why I think that's a broken system. Courts resolve that issue of lost credibility by considering each claim with total, clear, and mandated separation. Outside of that legal world with very well-defined rules, using such tactics reduces credibility.

I should note that in this particular case, the claim of racial biases is at least substantiated by a believable anecdote.

EDIT: To clarify why I think the legal methodology is broken: it's only because the same principles apply to criminal trials. IMO, prosecutors should NOT be throwing poorly substantiated charges at a defendant just to increase their winning probability and make the required defense more expensive.


> When I see poorly substantiated claims, it shouldn't drag the whole rest of the argument down, but it does.

It should. If someone is willing to make wild, unsubstantiated claims, it should detract from their credibility.


You're talking about a letter from a U.S. Senator citing published reports in e.g. MIT Technology Review and concerns raised by professional organizations. I think dismissing that as “wild, unsubstantiated” would require at least some discussion of the linked claims.


Why should I reference the particulars of the link when refuting a generalization?

If the parent comment had said something to the effect of "I should give the benefit of the doubt to a sitting US senator" then you'd have a point, but that context wasn't part of their statement.

Edit: Also, frankly, I wouldn't give the benefit of the doubt to a US senator. If anything, it makes the identity politics feel even more irrelevant.


Okay, try engaging intellectually with the reports rather than just reacting to someone’s one-sentence summary of thousands of words. It’s kind of hard to see any definition of “identity politics” which includes the reports but not your emotional reaction to accurate words.


> Okay, try engaging intellectually with the reports rather than just reacting to someone’s one-sentence summary of thousands of words

I wasn't reacting to a summary, I was reacting to an independent premise. The sentence I reacted to, at least the way I read it, was broad to the point of being more of an axiom with which the commenter interpreted the post. As I see it, disagreeing with an axiom is an intellectual engagement. Feel free to counter my disagreement.

> It’s kind of hard to see any definition of “identity politics” which includes the reports but not your emotional reaction to accurate words.

Not entirely sure what you're saying here, but none of what I said was emotional. I'm simply pointing out that pattern matching is a viable way of filtering other people's thoughts and ideas. If someone makes a wild claim, it should change the way you view their other claims, which may have seemed more rational in its absence. Not really an emotional statement in my mind, but again, feel free to point out which part of this you disagree with and I'd be happy to engage.

As it stands now you aren't really responding to anything I've said, but rather disagreeing that what I'm responding to warrants a response, which is rather tangential.


If you think the accusations are poorly substantiated, then make that case, rather than just complaining that too many accusations were included in the same paragraph.


I don’t have to. Using a Gish gallop of bad arguments doesn’t impose some moral imperative on me to prove every single one wrong.


No, but I think that raises the question of why you think the quoted bit above is a bad argument. It tracks with various reports I've seen linked in HN on this topic over the past couple years.


I know that 95% of the time I see a Gish gallop of identity politics, it's not an argument to engage in at all because even when you do, you're called racist yourself unless you subserviently agree with every aspect of the argument. The identity politics argument is often a tempting one because it allows people to act righteously indignant and feel powerful.

Case in point, here's someone saying, 'How can you even handwave away systemic racism?' in reply to a comment agreeing that bias exists but is not deliberate: https://news.ycombinator.com/item?id=29164295

Also, the identity politics argument seems to hinder a simple, moral argument against surveillance software as a violation of privacy. The logical implication of the IP argument is that this surveillance software would be okay to use if we manage to work out all the kinks.


I have found that two to three strong points far outweigh a list of 5-10 weaker points. This holds even when the original two points are included in the longer list.


The difference is that lawyers act in a structured environment with specific rules on how things should be considered - you bring up all the possible claims because if any of them get thrown out, it doesn't impact the others.

It's not the same with general discourse - when you raise a bunch of issues that aren't especially relevant and seem designed to be inflammatory, you damage the credibility of your other arguments. Arguing that test proctoring software is transphobic is such a stretch that it makes you question whether the author has such strong biases against the software that their evaluation of it is just generally too biased to be trustworthy.


> Arguing that test proctoring software is transphobic is such a stretch that it makes you question whether they author has such strong biases against the software that their evaluation of it is just generally too biased to be trustworthy.

It seems like the software is matching people against existing images, based on the issue with the black student, and trans people are, I would assume, more likely to change their appearance, including as a result of taking hormones and having facial feminization/masculinization surgeries.


> Don't you normally expect people to make the strongest case for preventing something?

Yes, but with some evidence. Otherwise, as a society, this is a bad direction to head in.



I mean, it's an article about how a technology is negatively impacting marginalized groups, and cites research that backs those claims. It makes sense to cite the groups and practices most impacted by it.

You could read the research and refute it, or you could just bluster about things. I know you tried to expand on it in your comment below, but reducing the specific concerns raised to "isms", and ignoring that at least two of the references in the linked articles (and their own references for the underlying research) addressed at least the socioeconomic portion of it, shows that you only applied a surface-level perspective and criticism.


Things are bad in particular ways - software that is more likely to falsely punish people of color does support the continued dominance of white people in a real, material way. Noting how things are bad seems useful and important to me.

That being said, there's a rotating list of "badnesses" that are in the zeitgeist and I agree that it's annoying to see them flogged at every opportunity (often w/o much insight).


"software that is more likely to falsely punish people of color does support the continued dominance of white people in a real, material way."

The statement is at best extremely misleading, and at worst mostly false. It also represents a juvenile, immature, and myopic perspective on reality.

I would suggest reading "Wealth, Poverty, and Politics" or "Discrimination and Disparities" by Thomas Sowell. He has been debunking the "inequality of outcome therefore racism" logical fallacy for decades.


Where did I say that unequal outcomes must, necessarily, be caused by racism? What is misleading about what I said?

It seems like you're reading a specific thing I said about a specific scenario and universalizing it in a way you imagine I might universalize it. I'd love to hear a critique of what I actually said, or we could talk about our views of society in a wider way, but I can't respond to this combination of generally dismissing what I said and attacking what you imagine I might think.


I stand at least partially corrected. I am not familiar with biases in facial recognition software but it looks like a real thing in some cases, caused for instance by lack of diversity in training data sets.


Software that's way more likely to fail you literally because of darker skin sounds way more like "inequality of opportunity" to me.


It’s more likely to fail because…physics. The sensor on your webcam is only so big, and can only capture so much light. Darker faces require more lighting to capture details. Photography isn’t racist, it’s physical limitations that come into play.


> Photography isn’t racist, it’s physical limitations that come into play.

I wanna pull this apart a bit because I think it's a good opportunity to talk about how systemic bias gets started. Digital sensor evolution is path-dependent. Technologists developed photo-sites that have "enough" dynamic range for most uses before moving on to increase the resolution on a sensor. What exactly is "enough" depends on your test data.

The sensor on a webcam is only "so big" as you say - but how that sensor trades off photo-site size (dynamic range) against photo-site count (resolution) depends on what conditions the makers consider acceptable. We could build webcams that would see more pigmented faces better - there is no fundamental limitation in the technology itself. It's that a series of decisions have been made over years of development, generally without people thinking specifically about race at all, and we've arrived at a status quo that has adverse outcomes for people with different skin tones.

There was a similar process that happened with film photography[1]. Not that film, as a technology, is unable to capture dark skin - but that the development standards that were tested and distributed were designed for lighter skin.

Like, I agree that the webcams we have aren't intentionally 'racist.' But I do think that the status quo that has led everyone to accept this balance of dynamic range and resolution is reflective of valuing people with lighter skin more.

[1] https://www.nytimes.com/2019/04/25/lens/sarah-lewis-racial-b...


A closely related story that will also resonate with you: early microphones were tuned for male voices, which led to a great deal of harm to women. It's just a matter of physics, as people are fond of saying. Mediocre technologists make mediocre products that were only validated for people like them.

https://www.newyorker.com/culture/cultural-comment/a-century...


A) The major issue is not the sensors, it's the lack of emphasis on darker skin in the ML training data.

B) Even if it were simply physics at play, requiring the use of a system with known physical constraints against darker skin, where those constraints alone can cause failing grades, is still pretty racist.


It’s hard to make people care with “bad/user hostile/privacy invading” because those terms have become saturated: they describe behavior users have shown they're okay with. Example: tons of articles mention FB as a privacy invading or user hostile service but it continues to be used by people who don’t really care. Using the same terminology for something that is arguably worse with much higher stakes (algorithmic proctoring that “reinforces white supremacy, sexism, ableism, and transphobia”) is appropriate because it gets the reader to care by illustrating exactly what is possible with algo proctoring.

I sense you’re tired of discussions that mention the “cards in the deck”, likely because you aren’t affected by them and therefore care little for them. That’s honestly fair, but there’s value in writing that way to channel outrage into action.


[flagged]


But extrapolating from one HN commenter onto the entire zeitgeist is acceptable to you?


Direct question: do you agree with the aforementioned commenter, and why or why not?


Is your criticism that these accusations are false, or just that they included too many true accusations in the same paragraph?


My criticism is that including a laundry list of "isms" is a polarizing, low-effort rhetorical device that elides a lot of nuance.

White supremacy: darker skin tones tend to photograph worse than lighter skin tones. Laptop webcams are notoriously crappy and can make this effect worse. Is this white supremacy or physics? Can the test instructions be modified to ask all users to have an appropriate lighting setup (i.e. lit from above and from the side to ensure that your face is foregrounded properly)?

Ableism: for users who require assistive technologies, is it better to take a test in their own space with their own equipment or to travel to a test center and use shared lab equipment? For users with mobility challenges, is it better to be in their own space or travel to a potentially non-accessible testing center?

Sexism: for working mothers, better to take a test in your own home or travel to a testing center and arrange for child care?

Two "isms" seem more relevant but weren’t mentioned: ageism - because fuck boomers, right? Socioeconomics - not everyone has access to a PC that meets the specifications or can acquire one on short notice.

Lots of context is discarded when one engages in polarizing categorical rhetoric. I’m not here shilling for proctoring software but rather for nuanced discourse.


> White supremacy: darker skin tones tend to photograph worse than lighter skin tones. Laptop webcams are notoriously crappy and can make this effect worse. Is this white supremacy or physics?

I’m not a photography expert, but it seems to me that cameras are physically equally capable of overexposing an image and underexposing an image. If a particular camera which is used in a facial monitoring system tends to do one rather than the other, I would ask why that is the case.


Doesn't work like that. Overexposing (with the same light) means having worse SNR due to higher gain.

The choice to use typically terrible cameras in a proctoring system, disregarding that it might work even worse than normal for a subset of people, is suspect, yes.


But wouldn't it be just as easy to create a cheap camera that tends to correctly expose dark skin in normal lighting conditions, and cannot dial down exposure enough to prevent light skin from being blown out in normal lighting conditions? If that's possible but cheap cameras tend to not work this way, then why is that not the case?


Yes and no, but mostly no. Lighter skin reflects more light (which means more signal), so it's inherently easier to image (it would be harder in extremely intense light, but getting fast shutter speeds is a lot easier than dealing with not having enough photons). Auto-exposure algorithms tend to work more accurately on lighter skin, too, which could be improved but is generally not something implemented at the hardware level in a webcam as far as I know (software can ask for different ISO sensitivities and shutter speeds).
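To put rough numbers on the "more signal" point, here's a back-of-the-envelope, shot-noise-limited approximation, where S is the number of photons a photo site collects:

  \mathrm{SNR} = \frac{S}{\sqrt{S}} = \sqrt{S},
  \qquad \mathrm{SNR}(S/4) = \sqrt{S/4} = \tfrac{1}{2}\sqrt{S}

So a face reflecting a quarter of the light gets half the SNR, and raising analog gain multiplies signal and shot noise alike, so it can't buy that back; only more light or a more sensitive sensor can.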


The "isms" referenced are less about the fact you can work around these complaints line by line and more about the fact no one bothered to check them before rolling out required surveillance technology for education.

White supremacy is not just bad faith things done by bad people, it is also the assumption that whiteness is the default experience and the failure to account for that not being the case. Similarly with ableism and sexism.

That being said, complaining about the accessibility & inclusiveness of our required surveillance technology does have a dystopian feel to it, lol.


> White supremacy is not just bad faith things done by bad people, it is also the assumption that whiteness is the default experience and the failure to account for that not being the case.

This only works with "new" definitions of racism. It is in fact plainly racist on its face to demonize a group based on immutable characteristics. It is even worse when actual diversity of thought is ignored and people of color are demonized because they don't agree with a race-marxist ideology.


I'm not sure I 100% understand what you are reacting to. I want to understand these two points a bit better:

  It is in fact plainly racist on it's face to demonize a group based on immutable characteristics.
Yes. Where have I, or the study's author, done this?

  It is even worse when actual diversity of though is ignored and people of color are demonized because they don't agree with a race-marxist ideology.
Yes. Where have I, or the study's author, done this?


Would these webcams have shipped if you couldn't see white faces well on them? I don't think they would, and the test proctors would not think of requiring one that didn't. Both the camera makers and the proctors are putting in an assumption that these products are only/primarily for white people, and thus discouraging non-white people is not an issue.


It's not news that facial recognition is best at white male-presenting faces and bad at all the others. ProctorU is also pretty hard to use for the differently abled.

I think you just reflexively dismissed this because you saw a basket of words that normally go along with things you disagree with, but the argument is pretty solid.


> It's not news that facial recognition is best at white male-presenting faces

Really? On the male part? I'd have expected it to do best with women, because I'd have figured facial hair is more difficult to deal with than a wider variety of hairstyles.


When they break even worse when you have black skin, it certainly sounds like a civil rights violation.


It really doesn’t.


> Things cannot simply be bad/user hostile/privacy invading, etc.

In this case, the algorithm is actually bad for all ethnicities [0]. It's just that it's extremely bad for black students (fails half the time) and just regular bad for everyone else (fails a quarter of the time).

[0] https://www.theverge.com/2021/4/8/22374386/proctorio-racial-...


I followed the references to find the actual argument behind calling surveillance proctoring all of the above, and these are the relevant bits[0]:

>At the beginning of a test, these products ask students to verify their identity by matching their appearance with a photo ID. As Os Keyes has demonstrated, facial recognition has a terrible history with gender[x]. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity. If a student’s gender expression or name on their ID are different from their current gender expression or name, the algorithm may flag them as suspicious. When this happens, they may have to undergo another level of scrutiny to authenticate their identity, an already common and traumatic experience for trans and gender non-conforming students. If these students are not alerted of this possibility before the test begins, it may force them to either discontinue the test and risk their grade, or out themselves to their course owner when they may not want to, risking more trauma and discrimination including being denied financial aid, being forced to leave their institution, or have their lives put in physical danger.

>The Eugenic Gaze is a combination of white supremacy, sexism, ableism, cis/heteronormativity, and xenophobia. When we apply the Eugenic Gaze using technology, the way we do with algorithmic test proctoring, we’re able to codify and reinforce all of those oppressive systems while avoiding equity-based critiques because of our belief in the neutrality of data and technology.

Their recommendations are quite reasonable:

>Don’t use algorithmic test proctoring. Instead, focus on pedagogical techniques that you can use to design assessments, online or in person, that draw from personal experience or require students to apply concepts in unique contexts. If you have to use algorithmic test proctoring, make sure students know about the test settings and ID requirement well before they take a test, and assure them that you will not take any behavior flagged as “suspicious” into consideration that isn’t described explicitly in the syllabus.

The GP link[1] instead calls out "Facial Recognition Tech", and "Algorithmic Proctoring" as being too biased and follows up with a petition[2] to ban these entirely.

[0]: https://hybridpedagogy.org/our-bodies-encoded-algorithmic-te...

[1]: https://library.auraria.edu/news/2021/why-online-test-procto...

[2]: https://www.sheaswauger.com/post/petition-to-ban-facial-reco...

[x]: https://ironholds.org/resources/papers/agr_paper.pdf "The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition"


Which one of these is inaccurate?


The claim of "transphobia" is extremely weak, for one. Clicking through the maze of links, the argument appears to be that because algorithms that attempt to guess gender based on photos sometimes guess wrong when trans people are involved, all attempts by computers to look at faces must therefore be transphobic, even if those computers do not attempt to guess the gender of the person whose face they're looking at.


Misgendering people intentionally is transphobic, and systems of this nature behave this way. For many trans people getting an ID to correctly reflect their name and gender is extremely difficult and can go for months/years without it being corrected; companies like Proctor U don't care about the problems this causes: https://www.teenvogue.com/story/exam-surveillance-tools-remo...



The quoted person expands on their arguments on those topics in the linked article. If you think they're needlessly "playing a card" why not engage with the actual argument?


[flagged]


You're anonymous online. You have the ability to create a throwaway and refute the actual arguments in the article all you want with literally 0 repercussion. Honestly seems more like you're chasing your own version of woke points.


OK, fine.

> It’s not clear to me that algorithmic proctoring is a modern surveillance technology that reinforces white supremacy, sexism, ableism, and transphobia. The use of these tools is an invasion of students’ privacy and, often, a civil rights violation.


b) is untrue. It's pretty clear when fake accusations are BS and you don't have to be "labeled" to call them out.

It does require you to engage with people directly.


I'm calling out this accusation as fake and unsubstantiated.


You're not being accused of anything, so whatever's fake is in your head.


Clearly you are attempting to deliberately misread my comment since your comment literally makes no sense in context.

"This" clearly means the original accusation in the OP.


Yes because to do otherwise would be exclusionist.


I've never graded tests or papers, but I always assumed cheating would be obvious because if you go to chat with a student about the problem, they will not have anything to say.


Well, if you put up an online exam that consists of only basic multiple choice tests, then you're making it a lot easier to cheat. That's the source of a lot of the trouble.


That's true, but questions that are difficult to cheat on are both difficult (time-consuming) to create, and difficult (time-consuming) to grade. Which means less time for other stuff, like actually teaching.


So I only have experience creating/grading tests as a TA, not a prof, but that wasn't my experience at all.

The total time spent in creating a few good, easy to discuss questions, the answering of which would demonstrate understanding, and then reading them, thinking about them, and coming up with a grade, was probably actually less than the amount of time it took to create meaningful multiple choice questions that didn't have any ambiguity, and which weren't easy to intuit the answer even without understanding. Doubly so when we did away with "partial credit" answers, but instead made it so each question was 1 point (or otherwise all or nothing), you had, say, 5 of them, and what we really were looking for was a paragraph that showed understanding (rather than checking off boxes in a rubric of "mentioned A, B, and C"; short essay questions, if you will), an expectation we communicated to the students.

And that's aside from the actual project based grades, which were better still.


Not so much. They take time but to me it is a reasonable amount of time.


Yes, it’s difficult, but that’s why being a teacher is a job.


True, but that is not the same as proving someone cheated. Besides, when you have too many students, you cannot talk to all of them one-by-one in any meaningful way. Or assess them in a meaningful way, to be honest.

By the way, where I work, management pushes for stuff like proctoring, and more students, and "measurable" results, and so on. As a teacher, I don't care much about the whole grading show.


> you have too many students,

Then the students are not getting what they pay for.


This doesn't scale, and teaching these days is expected to scale.


You assume you would know the problem and that body language is truth.


As a teacher you're pretty expected to know the problem you just asked, and there's no amount of body language which can replace an answer to the asked problem.


> …reinforces white supremacy, sexism, ableism, and transphobia.

Article proceeds to use a lot of words to not show any of these being true.

I am against it for the general dystopian surveillance normalization it encourages. We don’t need to throw a word salad of made-up progressive insults against it to resist its implementation.


Agreed: even with room scans, you could defeat it by unrolling a giant cheatsheet on timer/remote control, and just stealing glances.

Still, ProctorU is a major deterrent to cheating; I'm just not sure it's worth the cost.


They also are terrible at how they treat your local machine. I had to use them for a course I was in. I was having issues connecting because they kept insisting I use some Flash-based tool. Their proctor started going into various settings on my Mac, clicking rather carelessly and seemingly at random, then declared the platform was unsupported. I immediately said "ok, so now you're going to return my machine to the state it was in before you started messing around..." They immediately dodged the question and ended the chat. It took 3 more attempts to even get the test started and like a week to find all the damage they did to my settings. Terrible company.


They also lie about supporting Firefox. A family member needed to use this for a professional license. Following the instructions (which is basically turning off most security warnings and installing a bunch of malware) didn't work and the first thing support said was to install Chrome.

Shockingly, this was due to some JavaScript relying on an older Chrome proprietary API so there's no possible way they actually tested it against their alleged support matrix.


I once did a security review for a site that claimed to only support Chrome. I tried Firefox and used a UA switcher to fake being Chrome, and sure enough, the site didn't work. The page would load, but nothing would interact.

Turns out, their JS minifier was creating code that contained a syntax error. Chrome was able to make it work, but Firefox would silently error out. Rather than try to solve the problem, they blocked any browser that wasn't Chrome.

-_-


When IE8 was released:

Me, to <litigious tech company>: “Your JavaScript fails on IE8 because it now throws an exception when it attempts to set an invalid CSS value. I made a tiny patch but do you have an ETA for the fix?”

LTC support: “We don't beta test Microsoft's products for them!”

Me: “Okay, it was released this week. How's testing going?”

[a week passes]

LTC support manager: “Hey, can we get a copy of that patch to give to other customers?”

My employer at the time paid 7 figures annually for support.


Sounds like an open and shut case of false advertising, a crime.


Sure, got a few million dollars to bring a lawsuit, knowing that if it starts to go somewhere they'll issue a 3-line patch and blame the intern for not testing it?


I know it's just a saying, but I feel the need to demystify this.

Lawsuits don't cost millions. Court fees are absolutely never that high, and lawyers, while some may be expensive, are generally affordable for ~middle class (or even lower class if someone wants to do pro-bono work for you)

The whole "lawsuits cost millions" thing is a myth perpetuated by big corporations and further relayed by normal folk who hear it from somewhere else, who probably heard it from somewhere else, and so on.

When you read in the news "X company wasted $XX million in legal fees", what it actually means is "they stretched out the case with a team of very expensive corporate lawyers whose fees run into the millions".


You're right that lawsuits don't always cost millions. However, they will cost at minimum tens of thousands of dollars. Filing fees are generally a few hundred dollars per document, and median lawyers' fees are somewhere around $300/hr, depending on jurisdiction. And--in the US--it is generally expected that you pay your lawyer's fees whether you won or lost.

The advice I have gotten from actual lawyers is that it's literally not worth it if you expect to get only a few thousand dollars.


This is why we need robot lawyers that can Sue-as-a-Service for $5/hr. Just log into the website, type in who you want to sue and why, and it should take care of the rest. With enough proceedings from past cases, it should be possible to train an algorithm to craft the arguments most likely to win.


Unfortunately, that doesn't really work. While there's a lot of court documents that are going to be highly formulaic and could plausibly be written almost Mad Libs style, there are several court documents that are going to rely very heavily on the unique factual nature of the case. Responses and replies to motions are going to fall into that latter category almost universally.


What about just funding public prosecutors? (Or rather more funding for public prosecutors)


I don't understand how you think this is solely one-sided.

> "they stretched out the case with a team of very expensive corporate lawyers whoses price ranges are in the millions".

Yes - The company stretched the case out with expensive lawyers: Do you think the other side is somehow not obligated to also continue dealing with that case?

Who pays my lawyer while the company stretches the case out? Oops - that's still me.

----

As someone who has actually retained a lawyer for dealing with a previous employer:

1 - Most places had zero interest if the money at play was less than 100k (ie: They would not take the case unless I had a potential win of 100k or more)

2 - They charge ~$350 an hour, sometimes billing "intern" work at ~$150 an hour instead. I make good money (~200k), and even spending my ENTIRE yearly income I could afford less than a month of round-the-clock lawyer time a year.


> The whole "lawsuits cost millions" thing is a myth

That may be, but they can easily cost many tens or hundreds of thousands of dollars. Lawyers typically bill at multiple hundreds of dollars an hour so it doesn't take a lot of hours to rack up five- or six-figure costs. That's high-stakes poker for most people.

I once sued a neighbor for their barking dog. It cost me over $10,000 before I pulled the plug.

https://blog.rongarret.info/2009/07/dog-days.html


Okay, yes, hopefully that's hyperbole but it's still a LOT more than most people are going to want to spend — that's why this works: if they were trying to take your house, sure, you'd lawyer up but when it's more like a principled stand on privacy, an awful lot of people are going to reasonably conclude that it's not worth the cost. This is the advantage to having, say, a government privacy regulator which has lawyers on staff whose entire job is to do things like this.

This is especially worth considering with this particular company, which has a history of using legal threats to silence critics:

https://www.gofundme.com/f/stand-against-proctorio

I would DEFINITELY not jump at the chance to incur a similar reaction.


I worked with ProctorU in a university setting. Refreshingly, our (very large, public) school did not really want to use the platform, cautioned strongly against it, and were well aware of how invasive it was. They were worried about the potential for student outrage via media channels as a result of the race-based inaccuracies and biases and other issues that were coming up in the media. Oh and ProctorU had a data breach, which students happily reminded the institution of. A very small minority of instructors insisted on using it, and that's what I was helping with.

Seeing the backend of this tool was much more worrying than being subjected to it. Simply put, the platform is SO bad, that it could not be used as evidence even in the most blatant cheating cases - for example, the screen capture feed and the webcam feed were two separate files, neither of which was time stamped. If a student had a poor or marginal connection, these two recordings would get out of sync and could never be reconciled. It was so primitive it was laughable. That's on top of the issues you'll find documented elsewhere.


Of course it can be used to catch cheaters. The bar is just so low that ANY suspicion is cause for sounding the alarm.

In my SO's case, they told her wearing a sleeveless shirt was "inappropriate" during an online exam and told her to put something that covers her shoulder on so she ended up wearing a sweater in 80F inside. When we reached out to their support, they said there is nothing they can do about it and rattled off some standard script.


> In my SO's case, they told her wearing a sleeveless shirt was "inappropriate" during an online exam and told her to put something that covers her shoulder

For an adult how is that not a discrimination lawsuit?


That's what I said. A younger, more aggressive me would have just filed a small-claims case and let their legal department figure it out.

I worked with her to try to get it escalated and at least get the test fees refunded or compensated, but it didn't work. She just accepted it and scheduled another exam.


I'd love to see a writeup/documentation on the backend issues you mentioned here. Sounds egregious! Do you have a link to point me to?


I'm too intimidated to write it up publicly (though I detailed it all internally). A competitor of ProctorU, Proctorio, filed suit against a guy in a similar position to mine at a Canadian university. We're a big enough institution that others look to us for guidance, so I hope our de-facto moratorium on using these tools serves as an example.

https://www.eff.org/deeplinks/2021/02/student-surveillance-v...


This is my first comment on this site after lurking for a few years. I thought I would add my personal experience using ProctorU and how weird it feels to willingly give up my personal privacy to take a test online.

I go to UOPeople, which is a tuition-free online school where I am getting a 4-year computer science degree (for like $5k, which is crazy). Anyway, of the 40 courses you have to take, only 11 are proctored. UOP offers 2 choices for proctoring: A) find a real-life proctor, or B) use ProctorU. I don’t have the luxury of finding a real person, and with covid it's even more unlikely, so ProctorU it is.

All in all it's not the worst thing in the world, but when I first read the requirements for the ProctorU testing environment and technical requirements, I almost quit school completely to look for alternative paths. Some of the crazy requirements include: Your testing space must have nothing on the walls or floors. You can't wear glasses while taking the exam. Your desk must be clear of everything besides the specific testing materials (a calculator if you're lucky, and maybe a pencil and paper). Your device needs a webcam, so that they can not only watch you for the entire 1 hour and 30 minutes while you take a test that determines 40% of your grade, but also so that you can show them each wall, the floor, and under your desk. And that's just the physical space requirements. I had to empty my closet and use my laptop to take this test because there is no way I just have an extra room for testing…

The digital requirements were pretty intense as well: access to folders they had to write to, Chrome settings, and a whole bunch of wack stuff. I created a dummy account just to take tests.

When you go into the program they have you download, it acts like a one-way mirror: you can hear the proctor (if you are lucky enough to have a human proctor) and they can watch you, see your screen, and hear you. I had some tech issues once and I was grateful to have a proctor with a sense of humor who was able to help me through it. I can't say a cheaper degree is worth my privacy, and I hope that this doesn't become normalized, because it is not a pleasant experience.


> You can’t wear glasses while taking the exam.

How is this not massively illegal? This is a clear ADA violation. I cannot see without glasses. Not like, things are a bit blurry, but like I have 20/800 vision that's correctable to 20/20 with glasses. Forcing me to take an exam without glasses is forcing me to fail an exam for a reason that has nothing to do with my academic abilities.


Yes, that's shocking. Many, many people are simply incapable of using a computer without glasses.


I can't, why is this even a requirement? Are they worried about a Google Glass like thing?

Actually surprised they don't require you to take the test naked.


Why not have an electrified anal plug inserted that they can zap if you look away from the test while we're at it?

This shit is crazy.


My guess would be so their algorithm can ID my face. Same reason I can't wear glasses in my passport photo.


I also had an issue with not being able to wear glasses. It was my fourth SANS cert and I'd never had the issue before. The proctor also sabotaged me by logging that I requested "technical support" 5 times during the exam, each time with the timer running and some dude distracting me, despite me telling him I had no tech issues and to let me continue my exam. They would spend 5 minutes verifying that I indeed had no issues and then leave...

Very unprofessional, if not illegal discrimination. Even though it was one of SANS's entry-level certs, I barely passed, versus 90+ on all of their advanced ones taken without these issues.

Their reasoning was "The rules say no facial obstructions; your glasses block your face." They have to be hiring the dumbest people to proctor these exams.

I've never had any issues with other proctoring services; things like Pearson for CompTIA and Microsoft were actually enjoyable. With ProctorU, each proctor seems to find some issue and you have to argue with them, since it's completely unreasonable.


> All in all it's not the worst thing in the world, but when I first read the requirements for the ProctorU testing environment and technical requirements, I almost quit school completely to look for alternative paths. Some of the crazy requirements include: Your testing space must have nothing on the walls or floors. You can't wear glasses while taking the exam. Your desk must be clear of everything besides the specific testing materials (a calculator if you're lucky, and maybe a pencil and paper). Your device needs a webcam, so that they can not only watch you for the entire 1 hour and 30 minutes while you take a test that determines 40% of your grade, but also so that you can show them each wall, the floor, and under your desk. And that's just the physical space requirements. I had to empty my closet and use my laptop to take this test because there is no way I just have an extra room for testing…

Talk about security theatrics ... I'd just stick the answers I want on a piece of paper under the cupboard within reach of my feet, or tape it to the bottom of the desk, and peel it off while scratching my crotch.

> access to folders they had to write to

Why? They are already recording all processes and the screen. Again, just theatrics. And if you can hide it from that, they are never going to find it anyway.


>You can’t wear glasses while taking the exam

How is that legal? My face would need to be less than 10cm away from the screen. So - there goes using the camera to monitor where I am looking.


Also no eating or drinking during the exam, just because.


Yeah. I had to remove everything from the test-taking room, including furniture and a desk lamp.


> You can’t wear glasses while taking the exam

Don't know what class/proctor you took, but I took several UoP tests with ProctorU and never had issues with glasses.


If it's anything like my experience it's luck of the draw on the proctor and their "interpretation" of things. I used the same room and setup for every exam with them. Second to last one, the proctor said my room was 100% unacceptable. I protested and was told that there was no way this very same room ever was considered acceptable. So I moved into a new space and finally moved on to the test. My last exam with them? Used the old room and had no issues.


The proctors are random, poorly trained people being paid near minimum wage in the American South who probably turn over every few weeks. I can't expect a high degree of judgment.


I'm another reluctant ProctorU user; it was thrust upon me if I wanted to complete my online master's in CS. It's spyware, and it really is that awful. At least for my tests, you're required to buy an external camera and scan the entire room before taking the test; it records you the entire time and runs in the background (and foreground) with system-level privileges. Taking a test this way is very stressful compared to just walking into a building with a pen and your phone on silent.

Nearly every college student during the pandemic had to use ProctorU, or a similar alternative, in order to complete their classes. It's quite disturbing that the experience is normalized, and I wish there were an official OS-level feature for "report all activity on the system from time X to Y", without having to use a sketchy third-party app.

I wish the author the best of luck fighting the requirement to use ProctorU.


> I wish there were an official OS level feature for "report all activity on the system from time X to Y"

Oh please don't. This type of espionage should be discouraged, not officially supported.


The thing is, it still wouldn't be enough. E.g., I currently use Magisk to tell Google Play and all apps that my phone isn't rooted, allowing me to use app features they would otherwise lock me out of.


If you're using MacOS, this API already exists: https://eclecticlight.co/2020/10/27/xprotect-what-do-we-know...


Well, if you feel like putting something like this music holder above/behind your webcam, it will do the trick for holding notes, lol.

https://www.amazon.com/American-Plating-502N-Valve-Instrumen...


Stop testing people on memorization and then you don't have to worry about cheating. Allow people to recall data with resources typically available to them in the real world. This is the same issue I have with code/interview challenges that say don't use the internet. I would be a fool to not use the resources readily available to me or at least validate what I think I already know.


I think this common argument underrates the extent to which core factual mastery informs your ability to perform analysis and to synthesize arguments. For example, if given the exam question “Discuss the role of demagoguery in Athenian democratic politics in the Peloponnesian War.” and you need to look up whether the Sicilian Expedition happened before or after the death of Pericles, then you probably don’t really understand the role of Athenian political dysfunction during the war either.


I very much agree with you that knowing a lot of facts is very helpful for developing arguments. But I do wonder what students should be expected to produce on an exam.

Perhaps it's about the type of question asked. Knowing the Sicilian Expedition happened after the death of Pericles is different than knowing it happened in (checks Wikipedia) 413 BC.

But then knowing the year is very important for the world historical context. At that era in Greek history, events in Persia had more impact than events in Italy and Britain was essentially unknown to them. Yet here we are living in a world where the Parthenon friezes are in the British Museum. It's hard to put together the different moving parts without dates.


I have to look up all of those things, which means I didn't gain anything in class/assignments/reading, and no amount of Googling is going to help me if the person reading my answer is an informed person, whether my test is proctored or not.

Which is a long way to "yes, I agree."


Nearly every exam I had at university as a computer engineering student allowed one sheet of notes. So they were already not focusing on memorization even 20 years ago.


All of my B.Sc. Applied Physics exams including the finals were open note (anything in your own hand plus any duplicated sheets handed out in lectures). My finals were in 1977, Exeter Uni.

There was no limit on how many notes one could bring into an exam. Some of the weaker students turned up with rucksacks full of ring binders, and the invigilators frequently had to admonish them to make less noise rustling the papers! Those students almost all failed or attained only a pass degree.

In my opinion this successfully weeded out those who thought that memorisation was enough. The exams typically never asked anything that could be answered simply by looking up the answer in notes or even the textbook.


Yes, the sheet of notes means you're not memorizing formulae, but I'm 95% sure the real motivator for professors to allow them is you learn when you put the notes together.


I just finished up my master's in CS from OMSCS at GT, and some of the classes allow a sheet of notes. It's still a thing, even in the days of online education and proctoring software.


I graduated that program a few years ago. The best final I had was Intro to HPC: you were allowed to use the book, notes, internet, etc. The questions were open-ended and in-depth enough that the average on the test was still around 60-70 IIRC. You need a very deep understanding of the material to answer the questions sufficiently.


It could also help prevent unwanted communication between the students and the outside world (or between themselves).


> Stop testing people on memorization and then you don't have to worry about cheating

I'm not following. If you don't know the material, you have same incentive to cheat.


Most of this proctoring software easily detects VirtualBox, VMware, etc.

But QEMU/KVM, the de-facto hypervisor on Linux, is harder to detect. Even the others I mentioned can be hardened to evade detection.

And if you do a little bit of tinkering and intercept traffic, you can make it so that all the cheating reports from the "AI" never leave your computer. I've never played with ProctorU, but I have experimented with a couple of similar programs. They usually send regular reports every five minutes and some anomaly reports (some extra software running on your computer, another person in the room, face not visible, etc.) when something happens. You need to intercept and modify traffic so those anomaly reports are never sent. This is easier if it's browser-based; you need to install system-wide certs if it's installable software, and it's a lot more work if they use certificate pinning inside binary installable software. I have never encountered the last one, though.
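For context on the detection side, here's a minimal sketch (not ProctorU's actual code) of the classic x86 check, using the GCC/Clang cpuid.h helpers:

  #include <stdio.h>
  #include <string.h>
  #include <cpuid.h>  /* GCC/Clang x86 CPUID helpers */

  int main(void) {
      unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

      /* CPUID leaf 1: bit 31 of ECX is the "hypervisor present" flag. */
      __get_cpuid(1, &eax, &ebx, &ecx, &edx);
      int hv = (ecx >> 31) & 1;
      printf("hypervisor bit: %d\n", hv);

      if (hv) {
          /* Leaf 0x40000000: hypervisor vendor string, e.g.
             "KVMKVMKVM", "VMwareVMware", or "VBoxVBoxVBox". */
          __cpuid(0x40000000, eax, ebx, ecx, edx);
          char vendor[13];
          memcpy(vendor, &ebx, 4);
          memcpy(vendor + 4, &ecx, 4);
          memcpy(vendor + 8, &edx, 4);
          vendor[12] = '\0';
          printf("hypervisor vendor: %s\n", vendor);
      }
      return 0;
  }

Real products layer more checks on top (device and driver names, MAC address prefixes, timing artifacts), but this CPUID probe is typically the cheapest first check.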


Most virtual machine detection boils down to checking the CPUID hypervisor bit and vendor string. Luckily, it is possible to configure VMWare, VirtualBox and QEMU to spoof those values in the guest machine.
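For example (a sketch; exact option names vary by version), QEMU/KVM can mask both the hypervisor bit and the KVM signature from the guest, and VMware has a .vmx setting to the same effect:

  # QEMU/KVM: clear the CPUID hypervisor bit and hide the KVM signature
  qemu-system-x86_64 -enable-kvm -cpu host,hypervisor=off,kvm=off ...

  # VMware: add to the VM's .vmx file
  hypervisor.cpuid.v0 = "FALSE"

VirtualBox has similar knobs via VBoxManage extradata, though the exact keys vary by version. None of this helps against checks on device names, which is the QEMU disk-naming issue mentioned elsewhere in the thread.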


This sent me down the rabbit hole on defeating this... I cannot stand this sort of authoritarian horsesh...

Defeating malware's VM detection is very interesting.

Links for others if they're interested:

https://github.com/a0rtega/pafish collects all the best-known detection methods into a test suite.

This issue is interesting/has links for sure: https://github.com/spender-sandbox/cuckoo-modified/issues/45...


When I had to deal with ProctorU, the software refused to run under a KVM VM. Detection of anything remotely VM-hardware-related made it alert, and the proctor refused to continue.

That's after fighting with the software to have it installed in the VM to begin with.


Reminds me that I still want to develop that camera driver that replays fake footage. Maybe I will build a small dongle with some storage for video.

I heard that MS is requiring notebooks to have an HD front-facing camera. Maybe they would still sign my device driver? If they don't, wouldn't that be a lawsuit waiting to happen?
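In the meantime, on Linux you can get most of the way there in software with the v4l2loopback kernel module plus ffmpeg. A rough sketch ("innocent.mp4" is a stand-in for whatever prerecorded footage you'd loop):

  # Create a fake webcam at /dev/video10; exclusive_caps=1 makes
  # browsers treat it like a real capture device.
  sudo modprobe v4l2loopback video_nr=10 card_label="Integrated Camera" exclusive_caps=1

  # Loop prerecorded footage into it.
  ffmpeg -re -stream_loop -1 -i innocent.mp4 -f v4l2 -pix_fmt yuv420p /dev/video10

The catch is that software can enumerate driver names and spot the loopback device, which is exactly why a hardware dongle that enumerates as a plain UVC webcam would be harder to detect.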


One big problem with QEMU is that its virtual hard drives have the word "QEMU" hardcoded in their name, which proctoring software easily detects. When I checked, there was a patch from a year ago to make that configurable, but it was unmerged.


Doesn't Windows 11 run under a VM by default?


Unpopular take: ProctorU is trying to solve an unsolvable problem. The only way they can make it work is by dictating the configuration of the device taking the test, and even then they are going to have lots of technical problems with false positives and incompatible software. This leads to impractical outcomes like, "Oh, just borrow your friend's computer" and unsafe situations like, "oh, just allow us to scan your computer for content we don't like" and so on.

This is the digital equivalent of forcing students to be naked to take a test in person.


This is my general stance, as well.

Impossible requirements have been placed on at-home exams and proctored testing, and this company stepped up to meet them.

Are they violating many moral and reasonable privacy codes to do it? Absolutely. It is a huge breach of ethics.

But universities and their professors asked for it.

Plenty of online educators already know how to lead classes and give tests without it; but many, too, are either lazy or overburdened and have asked for this.


> But universities and their professors asked for it.

It is interesting that institutions that often have "department of ethics" are the first to be OK with awful products like online proctoring software.


I spent years running a school and am an edtech developer: education needs to (and probably will) evolve to suit the nature of remote learning. We're in this weird phase where we're trying to shoehorn models and constraints from the in-person learning paradigm into remote learning.


If this will involve getting rid of closed-book exams, I think the world can only benefit from it.


My wife has taught for 3 years at an online only state charter school (US). The single most difficult issue to solve (waste of time) is integration between foundation school management software such as Infinite Campus (where grades are kept) and third party learning packages (where assignments come from).


Funny you mention that. My work is building course content and assessment delivery systems that can be plugged into any LMS. It’s the leading (modular) solution to this exact problem.


Sure, and I think it's hard to find someone who is perfectly satisfied with the status quo.

Coming up with a replacement is the hard part.


Prediction: The replacements will come from adult education, not from traditional academia.


I've seen it done successfully at some universities: Zoom for the lecture, and a far less invasive proctoring tool for the tests. You can even have in-person and remote students in the same class if the material is made available online.


Dartmouth medical school accused 17 students of cheating; over half of those accusations turned out to be based on erroneously generated data. [1]

[1] https://www.nytimes.com/2021/05/09/technology/dartmouth-geis...


I think I would rather let a dozen cheaters "get away with it" than let even one innocent person have their academic career damaged by a false positive.


Cheating is like crime. A criminal needs to be right every time to avoid being caught, but enforcement only needs to be right once. Even if you let one instance fall through the cracks, it's very unlikely to be an isolated instance.


Academic cheating also doesn't seem like a sustainable life plan. Even if you get your degree and land a job, what are you going to do when your boss and co-workers realize you don't actually know how to do the things you have a degree in?


You could stay in academia and contribute to the replication crisis.


All of these testing suites are absolutely intrusions of privacy. Proctorio had a portion of their EULA where they essentially stated that they will retain all of your information (your test results, your webcam footage, your microphone recording, etc.) for an undisclosed amount of time, and if the Proctorio brand were to ever be purchased by another private entity, that footage would become their property by extension. Pretty unbelievable stuff, being forced to take a test in an environment like that would probably cause me to spiral out into a nervous breakdown after a few minutes.


My favorite moment when my institution was evaluating proctoring software was when a faculty member asked: "What if a student's naked underaged sibling/child walks into frame during the exam? What is your corporate policy on evaluation and retention of child pornography?"

Shockingly, smarmy ed-tech hucksters don't have a good answer to this one.


Don’t read the EULA; just mindlessly click ‘Agree’ like everybody else. The corporations have already won, so there’s no need to give yourself anxiety over it.


It's not like you have a choice. If you refuse the EULA you will fail the class.


Relevant discussion from earlier today: The Magnificent Bribe https://news.ycombinator.com/item?id=29154178

> Surrender to the power of complex technological systems — allow them to oversee, track, quantify, guide, manipulate, grade, nudge, and surveil you — and the system will offer you back an appealing share in its spoils. What is good for the growth of the technological system is presented as also being good for the individual, and as proof of this, here is something new and shiny. Sure, that shiny new thing is keeping tabs on you (and feeding all of that information back to the larger technological system), but it also lets you do things you genuinely could not do before.....The danger, however, was that “once one opts for the system no further choice remains.”


I'm waiting for the world to catch up with the fact that looking up information while you are working is a core part of any real job. Do I care if my Linux Security Professional spends a few minutes looking up information on the internet before taking some action? It's not the case that anyone can solve any problem so long as they have a search engine. Without domain knowledge, an open ended web search is not going to lead to a convincing answer except for the most trivial questions.

This extends to coding interviews as well. Using the resources at one's disposal to get a sense of the landscape before diving into algorithms must surely be part of the job, right? What do I care if a developer needs a quick reminder before diving into a solution, or even reads up a bit and scans someone else's code before answering?

What is the value in ensuring that people have perfect recall if this is something that will almost never be necessary in a real world job?


I took an astronomy course a few years ago which had a fairly forward-thinking (if a bit lazy) instructor. The initial tests were fairly conventional. But for the intense test right before the final, he gave us a comprehensive take-home. With the full assumption we'd be hitting Google hard for the more difficult questions. He knew this was a complex topic. Thought we'd learn more and retain more with a test where we had to show some initiative in finding the right answers without the stress of having to remember it on the spot.

Plus, it doubled as a study guide for the actual final which was only a couple of weeks later. I thought it was a remarkably kind thing to do. Took out a little stress. Gave even the struggling students an easy "A". And it worked as a comprehensive guide to almost everything we covered.


You are not thinking about the real risk. It is not about preventing a candidate from Googling a few things on the side. It is about preventing a completely different person from taking the exam in the candidate's place and simply sending them the answers. And don't think it is just an abstract threat; there are whole businesses built around that. Unfortunately, there is not much you can do to run exams remotely and be sure the candidate is the one taking them without being extremely invasive.


This is a good point that I had not, in fact, considered.


I once bombed an interview at a major bank when the hiring manager pressed me on what I would do if something happened to the system and there were no internet to search for answers. My answer, that nobody would notice the system had crashed if there were no internet, didn't please him very much.


Ironically enough, you might not have internet access at your datacenter (reception issues, so no wifi or phone data, and switchport connections are often secured or don't route to a public internet). And things get really entertaining when your whole office network is down.

It's not an odd question. "Okay, the whole subnet where your credentials server used to be is now a smoking hole in the ground, and IT forgot to pay the fiber bill last month. What do you wish you'd done three years earlier to address this problem?"


It's always seemed a little off. I've been coding for 15 or so years and I still sometimes completely blank on certain javascript array functions and need to google it.


I invite candidates to "error out" to internet resources and use their own professional judgement about what is and is not okay.

To set people at ease I tell them up front that only one candidate has crossed the line (googled the solution) and everyone else has made perfectly appropriate choices; "elseif or elif?" and small details like that.


In consulting I produce better-thought-out and better-constructed recommendations if I can consult a book and previously delivered slide decks. Hell, even ISO standards.

On the other hand, at least a basic level of recollection is necessary for quick thinking in meetings; you don't always have time to look up documentation.


And you can perfectly evaluate that by setting a realistic (!) time limit and judging the quality of the answers. It's absolutely irrelevant how much recollection you have if you still manage to solve the problem efficiently.


I teach mostly MS students, and at the moment I'm sitting in front of a class taking an exam, so as you might guess I have fairly strong feelings about proctoring.

Basically I think that online evaluations have to be completely different than in-person ones. Proctoring is fairly trivial and non-intrusive for in-person tests - don't open your laptop, don't talk to the person next to you. For big tests in small rooms I assign seating.

Online is different. Basically there is no reasonable way to keep someone from hiring someone on Chegg to do their entire test for them, and the most horrible proctoring software in the world won't stop a determined cheater from balancing a cell phone at the bottom of their laptop screen...

You really need to use a different approach with online assessments, and honestly I don't know if it's possible to use online tests for some of the things that we use in-person tests for.


I had the same issue with ProctorU. Installing Windows on a 64GB USB and booting off that anytime I had to take an exam solved the problem to my satisfaction.


I suppose I'm extra paranoid because I have a dedicated (older) computer for courses that require some sort of installed software, including Zoom. I don't want my unmounted hard drives available to the software.


I suppose the alternative would be unplugging them, but your approach works too.


Lock the disks using ATA commands, then unlock them later.
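For the curious, a rough sketch of what that could look like, assuming a Linux box with hdparm installed and shelling out from Node; the device path and password below are placeholders, and be warned that a forgotten ATA password can effectively brick a drive (the drive only actually locks after its next power cycle):

    // Hedged sketch: wrapping hdparm's ATA security commands (Linux, run as root).
    // WARNING: illustrative only. A lost ATA password can render a drive unusable.
    import { execFileSync } from "node:child_process";

    function armLock(device: string, password: string): void {
      // Set an ATA user password; the drive locks itself on the next power cycle.
      execFileSync("hdparm", ["--user-master", "u", "--security-set-pass", password, device]);
    }

    function unlock(device: string, password: string): void {
      // Unlock for this session, then remove the password entirely.
      execFileSync("hdparm", ["--user-master", "u", "--security-unlock", password, device]);
      execFileSync("hdparm", ["--user-master", "u", "--security-disable", password, device]);
    }

    // e.g. armLock("/dev/sdb", "exam-day"); power-cycle, take the exam, then:
    // unlock("/dev/sdb", "exam-day");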


Still gives access to the EFI firmware. Hard pass.


At least Proctorio (despite suing a college student under the DMCA for reverse-engineering their software to show the extent of its capabilities [1]) doesn't go quite this far. It's a browser extension that I can install for an exam and remove afterward. It gets microphone, screen, and camera inputs, and permissions are handled through the browser.

[1] https://news.ycombinator.com/item?id=26898651


Just as a small reminder of what universities are all about: the dominant reason for the foundation of the Sorbonne was a surge in heresies. The principal idea was to prepare for the next heresy and to be able to respond quickly to challenges yet unknown. The university is essentially a wager on the future, not a mastery of the present.

Which brings us to what a university should be about:

* Understanding principles

* Being able to draw connections from principles to novel ends

* Being able to argue your point

* Being able to discern whether a point is argued properly or not (according to these rules and the rules of the particular discipline/field)

* Knowing about the current state of the art/ideas and their relation to principles

None of this includes a use case for spyware like this. Using it actually indicates that the institution is failing at at least one of the aforementioned goals, and that it is answering those challenges with a strict regime of established procedures, which is exactly what a university is not about.


> It looks like I can go to one of their regulated test centres and take the exam there.

ProctorU is absolutely, positively insane, but the alternative sounds quite reasonable.


I think a lot of people underestimate how rampant cheating is. I knew people in undergrad who would get entire groups together to take online exams. While invasive, ProctorU makes this far more difficult.

You could make the argument that this is how real life works, but we'll need to radically redesign the current curriculum if the internet becomes fair game. Long term that's a must, but short term, not making an effort to stop cheating will corrupt the entire institution: anyone who doesn't cheat is at a severe disadvantage.


The whole concept of cheating vs. using available resources to solve a problem doesn't really make sense, IMO. We have smartphones with Google/Wolfram Alpha/etc. with us at all times, as well as Stack Overflow if something isn't working. Why try to memorize the whole damn dictionary? It's outdated and archaic.

In the real world, my learning of a new piece of software proceeds as follows: the GitHub repository/manual of the software -> Stack Overflow/Biostars -> Google for whatever you want to do.


> The whole notion of online proctoring seems pretty whack to me also: what world are we training and testing people to live in? The real world has internet, you can search for stuff, you can work from home and take a break.

I agree! I wish more institutions viewed the world as Stanford's Honor Code does. It predates the Internet by many decades:

> Open-book Requirement: As stated in the Interpretations of the Honor Code, “If take-home examinations are given, they should not be closed-book examinations…” Open-book exams place no limitations on the materials or resources that a student may access during the exam.

https://communitystandards.stanford.edu/resources/faculty-an....


This sort of thing is why we sometimes need a platform regulator / "App Store".

Customers can't defend themselves against such intrusions of privacy from their school/government.

And employers too. They want you to respond to emails at all hours, and ask for way too much control over your device in exchange. This is slowly being changed with Android work profile and Apple "user enrollments", thankfully.


It needs to be a legal requirement with teeth: these tools would never be allowed through the app store approval process but that's not a problem as long as they're allowed to simply say you have to buy a laptop instead.

One alternative way to prevent this would be liability: if the institutions using this had to reimburse all of their users for every security hole in their mandatory software, or for the risk created by the security settings they require you to disable, it'd completely change the calculation for them.


I sure am glad we have Google and Apple looking out for us, instead of the traditional unions.


Or alternatively basic privacy laws that cover students, etc too.


I don't really understand your point. ProctorU has apps on both the Play Store and the App Store; how do Apple and Google save us here? They seem to have no problem hosting software for ProctorU.


What they have on the App Store and Play Store doesn't come close to the invasiveness of their desktop apps.

> Seems they have no problem hosting software for ProctorU.

The restrictions are on what their apps can do, not on who published it.


App stores are actually an enabler for this sort of cancer; Apple is already talking about reporting people to the police, and they can obviously add an anti-cheat as well. And there will be no way of escaping that.


I hate proctoring software with a passion and I will never ever ask my students to use it. But I think it's important to understand where the demand for this software is coming from.

In the "old" days, 20 students walked into a room, sat down, and took a test in silence with pencil and paper while the instructor stood there. Now, 350 students go wherever they want, with classmates and many internet-connected devices around them, and take a test unmonitored.

Sure, you can tell them the exam is open-note as long as they work alone; that's easy. But they can work together, and that's hard to detect unless they literally copy.

The #1 problem is Chegg. Students can screenshot the questions, post them to Chegg, and get answers back very quickly. And they do. On top of this, many schools have a large culture of cheating: upwards of 10% of students will cheat on unproctored take-home tests.

That 10-30% basically ruin exams for everyone, but most of all the instructor. You can't give people relaxed take-home exams, too many will cheat. You have to give a strict time limit. You have to put a ton of effort into making questions obscure or idiosyncratic, rather than just giving standard problems from years past. And remember you have 350 exams to grade, so you can't ask very deep questions.

So I can't get that mad at my colleagues who use proctoring software. I can't really offer a reasonable alternative.


What's the legality of all this? I assume refusing to install ProctorU means failing your exam, so you don't really have a choice. When the process presents you with all kinds of checkboxes and consent forms, it's mostly going through the motions of informed consent; at no step do you have a real choice. Surely no sane person would willingly allow this blatant invasion of privacy. This is practically duress.


The fact that a Linux security course uses ProctorU for its exam is the whole ironic point.


I was taking a certificate exam earlier this year, and looked into taking it via ProctorU or the old-fashioned way, at a testing center.

After looking at their system requirements and the draconian measures needed just to be able to take my exam (they also require you to have a webcam, which I don't have, and to broadcast in real time the private room where you will be taking the exam, which is a non-starter for me), I realized it would take me longer to set everything up correctly than it would to drive to the testing center, take the test, and be back home.

I heard some people say we live in a high trust society, but that trust only seems to be going one way. If I don't trust a corporation, I have no agency or power to act on that. But if they don't trust me, they impose these insane draconian measures without any oversight that preclude me from progressing in life, professionally or otherwise. It's fucking insane.


I run an online Javascript course and we have what I think is a better solution than ProctorU:

1. We don’t require the student to install any software - we use the video streaming feature of the browser on their laptop to record the student’s screen and webcam

2. We get the student to place their phone camera behind them at the 4 o’clock or 8 o’clock position with a view of the student’s desk, then we record the scene using the browser on the phone

In this way we get 3 video feeds - the screen and two cameras - which we can monitor live and record. You might argue that this is still too creepy, but if we cannot see the student while they work it’s just too easy to cheat and that would make our credentials worthless.
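For what it's worth, the browser-only capture in point 1 really does need no installed software; here's a minimal sketch using standard web APIs (the upload endpoint is made up for illustration):

    // Minimal sketch: record screen + webcam with browser APIs alone.
    // Requires HTTPS and the student explicitly granting each permission.
    async function startCapture(): Promise<void> {
      // Screen-share prompt (the student chooses what to share).
      const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
      // Webcam + microphone prompt.
      const webcam = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

      for (const stream of [screen, webcam]) {
        const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
        recorder.ondataavailable = (e) => {
          // "/proctor/upload" is a hypothetical endpoint, not a real API.
          void fetch("/proctor/upload", { method: "POST", body: e.data });
        };
        recorder.start(5000); // emit a chunk every 5 seconds
      }
    }

The phone feed in point 2 presumably works the same way: the phone's browser calls getUserMedia and streams the video out, so nothing is installed there either.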


If this is paid for and these requirements were only disclosed after the fact, there might be enough money at stake to entice a lawyer into making the course provider abandon the software.

That's more or less the only silver lining I can think of here.


For the certification program at Redis University (https://university.redis.com/), we previously used ProctorU because that's what other folks were doing -- proctored exams for certification.

Buuut, after a while, we were like, why subject people to this? This is crazy. And why even charge for certs anyway? I'm happy we're done with proctoring!


You know I attended The University of Memcached but years later when I asked for a copy of my degree certificate ... well you know the rest.


This looks like something the UK Gov could use for SELT exams, visa & immigration tests or other similar secure tests approved by the government.

At the moment these are taken in person in secure buildings where you are identified at the entrance, your belongings are stored in a locker, and the exam itself is taken in a secure room on computers with no/limited internet connection, etc. You are timed and monitored.


ProctorU is a mandatory trojan that ETS (among others) require. It's also poorly built to the point where it changes random Windows settings, takes over all kinds of management features, requires full admin access on a personal computer (not a university or corporate managed one) while burning 50% CPU.

Naturally, it's next to impossible to remove once installed. I speak from personal experience.


That's... fucking insane. "Dystopian" has been worn thin... this needs a whole new word, something with more punch. I guess "criminal" has also lost much of its luster. Which generation is creating the ProctorU software? Who's determining its features? It can't be the "boomers," can it? Is it a GenY thing?

This is so GenY!


A company tried to get me to use a similar service as part of the interview process. Here's the ridiculous list of preparations they expected:

----

Keys to a successful Exam Day:

Run the “Computer Requirements Check” at least a day before the appointment

Disable your pop-up blocker

Click “connect to proctor” to begin

Be alone in room

Have a clear desk and workspace

Be connected to a power source

Your mobile phone and headphones must be out of reach

Have your government issued ID ready

Be prepared to perform a room scan

You must be in the webcam view throughout the exam.

----

Same dystopian bullshit. A room scan?! What business of theirs is the contents of my room? What if I live in a studio and it's my whole living space? (it is) Massive privacy invasion that is completely unnecessary.

I told them no thanks. With ~20 years of experience under my belt (which they knew), I don't need to pass a test like a college student.


What I generally don't understand is the stuff they're protecting against - god forbid you google how to do a regex replace in a string - is stuff you can just do on a regular boring workday.


I've never seen a critique of proctoring software that offers an alternative solution to the problem (remotely delivering a skills assessment that is used to allocate resources, in an environment known to have rampant cheating).


At least in STEM:

Create an exercise that contains a technical term that does not exist, and create a web page containing that term with a plausible but wrong solution to the exercise. Make sure the page is easily found with Google. Give everyone who used the wrong solution a failing mark.

Personalize every exam. Create a pool of exercises and choose n exercises per student, based on the student id (a sketch at the end of this comment). Easily done if the sheets are already in LaTeX anyway.

Create heavy time pressure. Cheating is very hard when even completing all exercises regularly is almost impossible. (Lovingly called "Zeitklausur" in German, lit. "time exam", it's normal that students are unable to finish those in time)

Create exams that don't just test the ability to regurgitate knowledge, but test the ability to use that knowledge, and let students explain it in their own words.

Replace the exam with multiple small projects and presentations.

All of those things were used by different chairs / departments in my university :-)

Nothing will prevent "someone else writes the exam for another student" with absolute certainty, yes. But neither does proctoring software.
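To make the personalization idea concrete, here's a hedged sketch; all names are illustrative, not from any real system. It deterministically picks n exercises per student, seeded by the student id, so regenerating a sheet always yields the same exam:

    // Deterministic exercise selection: same student id -> same exam sheet.
    function pickExercises(studentId: string, pool: string[], n: number): string[] {
      // FNV-1a hash of the student id as the seed.
      let seed = 2166136261;
      for (const ch of studentId) {
        seed = Math.imul(seed ^ ch.charCodeAt(0), 16777619) >>> 0;
      }
      // mulberry32: a tiny seeded PRNG in [0, 1).
      const rand = (): number => {
        seed = (seed + 0x6d2b79f5) >>> 0;
        let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
        t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
        return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
      };
      // Fisher-Yates shuffle driven by the seeded PRNG, then take the first n.
      const deck = [...pool];
      for (let i = deck.length - 1; i > 0; i--) {
        const j = Math.floor(rand() * (i + 1));
        [deck[i], deck[j]] = [deck[j], deck[i]];
      }
      return deck.slice(0, n);
    }

    // e.g. pickExercises("s1234567", exercisePool, 5), then feed the chosen
    // exercise ids into a LaTeX template, one sheet per student.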


The best way is to do away with exams and grade projects instead. For math: write a survey paper on the subject that demonstrates expertise, then defend it orally over live chat, where you can't easily cheat. That's what my school did, but it only had European accreditation, whereas US regional accreditation favors 20th-century examination style.


Anything subjectively graded like a project is a big vector for introducing evaluator bias.


There's a discussion upthread right now about making the whole problem irrelevant by switching to open book tests and/or having a 5-minute conversation with the student to make sure that they actually have some clue what they're talking about.


Have students sit for an exam and then follow up with a question or two (from a pool of N), live, to see if they actually understand the material. We're paying hundreds or thousands each; they can easily afford to spend the time.


And the ones that do propose alternatives do so from a very narrow point of view. Oral exams are much more time consuming and simply not practical in many situations. Projects and seminar papers have similar problems, especially in early semesters.


Nothing in the video is particularly egregious other than having to use your own hardware to do it. All those precautions are standard in a controlled test environment; since they can't control your home and computer, they must ensure you're not cheating.

I don't see a problem with that; you're taking a test.

That said, I would not like to install this on my computer and would 100% use a separate user account to do so. As long as it runs in the home directory, I can trash it afterwards. I even use Zoom from my phone rather than installing it on my computer, but if I had to, I'd do the same.


There is an alternative solution to this problem with far fewer privacy implications...

Examind.io


I would probably buy a ~$200 HP Stream and use that...

Unfair having to spend such money, but textbooks aren't that much more expensive, and you can use it for more than one course too


Is there any non-dystopian spyware?


I don't think you're phrasing your idea well. Perhaps you mean that any piece of software that allows surveillance has the propensity to trend toward being misused? It seems to me spyware is a loaded word and has its connotations. What has happened here has been on a massive scale very quickly in a way we have not seen before.

Edit: For example, remote sensors placed in a power plant or foundry where people work also would constitute surveillance. But it is in an environment where carefully calibrated machines can otherwise fail catastrophically.


It's a dumb but critical question. I enjoy the question as it is; it inspires some good thinking.

For me, it made me realize that the word 'dystopian' in the title adds no facts to anything; it's just a judgement. A negative one.


I would say that if it does not spy the whole time it runs, it can't be spyware - does TeamViewer spy all the time, or can it merely be made to spy as a side effect of its main purpose?


TeamViewer came to my mind as well, but I'm not sure desktop monitoring is quite on the same level as a camera pointed at one's face. Unless TeamViewer has made some strides in the years since I last used it...

To add, I suppose what I am really getting at is that it may be more useful to address ProctorU in particular rather than bring it under the same umbrella as other less used surveillance software.


The dystopian part of it is that it's institutionalized.


People are pretty enthusiastic about spyware that helps parents control their children.


good example! Makes me sort of sad your username doesn't end with a t though.


I am looking for an exam proctoring solution at my employer (an accredited online university). What are my best options? Any solution can be hacked (i.e., its limitations can be worked around), and without proctoring there is no guarantee students won't have someone else attempt the exam on their behalf, or send the exam to someone to solve it for them. The only approach I see, though it is not favored by the Deans, is testing centers (Prometric, etc.).

Any suggestions?



It seems to me they should send each student a device for the test, with its own mobile internet, so they can use it and turn it off / send it back afterward.


The student can just use an auxiliary device.


Wow... people cheating their way through exams :D Hopefully I'm never going to hire one. I've met people who said they go to interviews, lie about their skills, and call it 50/50... They usually get kicked out after their lack of knowledge surfaces in a month or two. Gods, please save us from people who cheat on their IT certs and at uni. And on top of that, bragging about it... OMG, grow up.


I'm sympathetic to not requiring students to install spyware on their personal computers, but I also TAed an introductory CS class for freshman engineers and, wow, there was a lot of cheating. No more than 5% of students, but it was enough that you wanted to do something about it, and remote education makes cheating on exams a lot easier.


The use of Proctorio seems to mostly be a U.S. issue.

I have never seen it used over here (Poland). My friends from other European countries haven't either, at least when I asked them.

I wouldn't be surprised if GDPR prevents them from collecting most of that data. The fact that most colleges here are state-run might also contribute.


Yep, the GDPR saves us here, thank goodness for that.

There are better ways anyway, IMO. At my (german) university, most exams are so personalized (random exercises from a pool in random order, student ids used as const values in calculations, groups A / B / C, etc) that cheating is pretty hard.


Doesn't surprise me. Proctoring in general is pretty invasive. The last time I took a standardized test (the GRE), I was patted down, had my tattoos photographed, and was led into a room with a camera and a human monitor watching constantly just outside through a window, then given industrial hearing protection (to prevent cheating, I guess?).


A while back I had a remote job interview that required me to install spyware on my computer and turn my webcam on so it could record my face while I solved some computer science puzzle.

It made me not want to work at the company anyway. If they treated me like that during the interview, imagine your day to day.


A friend of mine told me about this Crossover company, which screened its remote workers minute by minute, taking photos with their webcams and monitoring their every mouse move and keypress.

Sounds quite depressing if true.


A well-designed test shouldn't depend on rote memorization, so it would require much less intrusive spying.

Example: design a database schema for a grocery store with such-and-such requirements... you either know how to do it or you don't, but you won't find the answer on the web.


> Linux/Unix operating systems

Well, I guess there's no way for me to use it anyway!

> Virtual Machines

It's my policy that closed source software gets installed in a virtual machine. Others need to abide by my policy if it's running on MY equipment.


As a workaround, someone could buy a new laptop from a major retailer right before the exam, install the software, take the test, uninstall the software, then return the laptop for a refund.


Oh wow. I just had an epiphany. This is why companies are still treating VR as a serious product that might have a large market, even as AR seems much better-suited to a mass consumer market and surely not that much farther off than good VR (perhaps closer, even). VR headsets with just a few sensors would make excellent isolation & monitoring tools for things like this.

Now I get why Facebook (Meta, whatever) decided to get into it. I'd not been able to piece it together until just now. Of course there's a spyware angle front & center. It all fits now. Their main market's probably intended to be business, but education makes sense, too.


I, for one, love the people who would be proudly installing this because they have "nothing to hide" while chiding those who are (appropriately) concerned.


In this scenario couldn't you just run it in a VM? Or would their software trip once they realize they can't see vacation photos on your desktop?


The article states the software won't run in a VM


This kind of software almost universally detects and refuses to run in VMs.


It is not hard to create a hardened VM. If anti-cheats can't detect it, neither can some off-brand user-mode teaching software.


It's harder than you think, and remember that the consequence is not “I can't play a game until I revert my config” but “I was reported to my college for an ethics violation and now my $$$ degree is in question” or “My professional organization has been told that I attempted to cheat and the certification I need to keep my job is in jeopardy”.

There are many things which are technically possible which are not a favorable cost-benefit for most people. This is in the same category as those guys who relied on technically being able to fly without showing ID to the TSA — there's a reason why it was mostly affluent white men flying solo, because the potential downsides are much greater for most other categories.


Got some links to resources about doing this? Would be interested in having a hardened VM on-hand for things like this.


KVM is probably your best bet on Linux, and VMware the best on Windows. https://github.com/hzqst/VmwareHardenedLoader works for VMware but doesn't work against some modern anti-cheats; KVM universally works against anti-cheats when configured properly, with RDTSC spoofing and such.


VM detection is usually pretty bad. They just look for magic strings that are easy enough to fake.
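For illustration, the naive kind of check being described often boils down to something like this (Linux DMI paths; a hardened VM simply overrides these strings in its virtual firmware, which is why they're easy to fake):

    // Sketch of a naive magic-string VM check (Node on Linux).
    import { readFileSync } from "node:fs";

    const VM_MARKERS = ["vmware", "virtualbox", "qemu", "kvm", "xen", "hyper-v"];

    function looksLikeVm(): boolean {
      const dmiFiles = [
        "/sys/class/dmi/id/sys_vendor",
        "/sys/class/dmi/id/product_name",
        "/sys/class/dmi/id/board_vendor",
      ];
      for (const path of dmiFiles) {
        try {
          const value = readFileSync(path, "utf8").toLowerCase();
          if (VM_MARKERS.some((m) => value.includes(m))) return true;
        } catch {
          // File may not exist on this system; skip it.
        }
      }
      return false;
    }

Real products add more signals (virtual MAC address prefixes, the CPUID hypervisor bit, timing checks), but the principle, and the weakness, is the same.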


The difference between "easy to fake" and "hellishly difficult" is whether the authors clicked next-next-finish in VMProtect or not.


There are github repos to harden VMware against VMProtect, let alone KVM


They still have to interact with the system and make system calls.


If your customer is losing $10k per minute while you google a basic Linux command, you're just being an unprofessional dickhead.


What is stopping someone from having a separate machine right next to the infested one?


Proctorio (competing software) can require students to "scan" their room by moving the camera around, making it more difficult to hide a second device.


How is ProctorU different from Microsoft or Apple or Facebook?


This sort of thing drives me crazy because it just needs someone to ask the question "Why is this necessary?"[0]

The reality is that a sufficiently motivated cheat will cheat, and will more than likely get away with it regardless of countermeasures when there's not a proctor watching over their shoulder[1]. If the goal is to test proficiency, there have to be better ways that work remotely.

Back in High School, "open book" tests were common in my Chemistry and Physics classes. The "open book" rule wasn't there to help overprivileged kids pass the test -- these were among the hardest tests; the book was useful only for referring to the myriad formulas a High School student was not expected to commit to memory, but if you were relying on that reference to tell you exactly what you needed to do to solve the problem, forget it.

This, obviously, falls apart in many contexts -- "open internet" tests are difficult to write in many subjects, simply because there are tools designed to answer those questions, immediately[2], but it likely just requires a little creativity.

From my own personal experience: I ended up taking a certification exam which was offered online due to COVID. They attempted to re-create the security of "testing in a center" by spyware and procedure (photographing the room from every angle, myself, my drivers license), all trivial to defeat if I was so motivated -- the procedure was even communicated in advance. And it was a certification for writing software[3].

I took practice tests, online, which turned out to be nearly word-for-word what I saw on the test. I knew the answers anyway, but had I wanted to, I could have spent my entire study time memorizing the test answers -- not technically cheating, but then the test has abjectly failed to indicate anything about my expertise. A more useful approach would have been to present me with a program utilizing the features they wished to test me on, followed by a set of multiple-choice questions about that code. With a diversity of test programs and frequent changes, it would reduce the probability that a Google search would yield an immediate answer, testing the candidate's ability to solve the problem using all of the tools that are available in the field.

Assuming that "testing is still the most reasonable way to assess skill", which is another argument in itself that falls victim to footnote "0", the point is to make irrelevant the forms of cheating that the spyware is attempting to prevent. Both are losing battles against cheating, but the former is less so, and certainly less consumer-hostile.

[0] The classic "Why do we do this?" with the most common answer being "Because that's what we've always done"

[1] Just off the top of my head: a small camera in the room aimed at the screen, and a discreet -- in-ear, like my daughter's (Bluetooth) hearing aids -- earphone connected via Bluetooth to a mobile phone, relaying to a third party in another room with a computer being used to research answers.

[2] And in a lot of contexts it still doesn't matter. If there's a tool that changes how that question is solved, and you're testing whether or not someone can solve that specific problem -- not whether or not that somebody can write a tool to solve that problem -- wouldn't it be more intelligent if they solved it using the most appropriate tool?

[3] Before I get grief: it was requested that I take it out of the expectation that I would require no study, and it was needed because we are a Microsoft Certified Professional shop.


> [1] Just off the top of my head: a small camera in the room aimed at the screen, and a discreet -- in-ear, like my daughter's (Bluetooth) hearing aids -- earphone connected via Bluetooth to a mobile phone, relaying to a third party in another room with a computer being used to research answers.

Watching the video that this article links to, a couple of the requirements are, among other things:

* To show the proctor all four walls of the room, and underneath the desk,

* To take out any earbuds you may have.

These requirements alone would probably mean your cheating solution wouldn't work.

(I hate that I have to say that because I do not want to advocate for ProctorU here, but in this case these requirements would probably do what they were intended to do.)


I was entertained until this undoubtedly vaccinated hero used Covid as an excuse for not taking an in-person exam, despite writing this article in November of 2021. It's okay if you don't want to do something, but stop exploiting Covid. If you're still doing this it sounds ridiculous. Get over it.


How do you enroll in an MSc program without reading how exams are done? I took remote university courses for a while; you either go to an exam invigilator and pay the $20 for them to monitor you (at some campus libraries it's free), or you have someone who meets the criteria volunteer to do it. Big deal; this article is a Twitter-quality rant.

The problem, of course, is regional accreditation rules around proctored exams.



