I agree. It takes the user out of their intended action (using your product) and puts them somewhere distracting (their email).
I've also seen it confuse users who aren't used to it.
It's great from a tech/security perspective but I wouldn't put it into my own product for those reasons. I definitely would not make it the only login mechanism.
That's technically how ads work in most places. You view ads instead of paying a fee. You see this in the context of free apps, free news articles, etc.
A virtual card is just as legitimate as a physical card. The numbers are basically just account IDs. In my business, I set up virtual cards for many vendors.
Sounds like you're trying to detect fraud. There are fraud detection platforms specifically designed for checkout fraud. Instead of reinventing the wheel with a home-grown heuristic, I would look into some of those solutions; they're fairly robust, taking into account many different signals and often data from across merchants.
Typically the ones who care about virtual vs. physical cards are the questionable vendors trying to push shady, automated, hard-to-cancel subscription services and the like onto customers.
Virtual cards are designed for exactly this purpose - to give consumers the upper hand against shady business practices.
Virtual cards are also nice for setting up rules and alerts. I can set up a virtual card for a single merchant, then tell it to alert on or block charges over a certain threshold.
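For a sense of what that looks like in practice, here's a minimal sketch against a hypothetical card-issuer API (the endpoint, field names, and auth scheme are all illustrative assumptions, not any real provider's API, though several issuers expose similar concepts):

    import requests

    # Hypothetical issuer API; base URL, fields, and auth are assumptions.
    API = "https://api.example-issuer.com/v1"
    HEADERS = {"Authorization": "Bearer <your-api-key>"}

    # Create a merchant-locked virtual card that blocks any charge
    # pushing monthly spend past $20 (2000 cents).
    resp = requests.post(f"{API}/cards", headers=HEADERS, json={
        "type": "MERCHANT_LOCKED",        # usable only at one merchant
        "spend_limit": 2000,              # in cents
        "spend_limit_duration": "MONTHLY",
        "alert_threshold": 1500,          # notify me at $15 of spend
    })
    card = resp.json()
    print(card["last_four"], card["state"])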
I haven't found these fractional sites to be very useful for development work. The rates are low, and the few dev jobs that do appear already have hundreds of applicants.
Probably mediocre. Even setting aside its viability as full-time income, and accounting for the fact that there's going to be a lot of unpaid/dead time, I personally won't take a job that pays less than $1K for a day-ish of work unless it's really interesting or I'm doing a favor for someone I know.
(One of the issues with short-term fractional work as an income source is that you can end up only being paid for a fairly small percentage of your total time. If the rate is good, that may be fine as a part-time job. But in my prior industry analyst stint, we actually had quite good day rates, yet we spent most of our time keeping up with industry happenings and writing free stuff.)
SaaS products are just businesses. This is like asking "Is retail dead? Every retail store seems to already exist."
There are always business opportunities to be had. You could probably open a business selling gravel and be wildly successful despite there already being tons of gravel suppliers in your area.
Computer Science is about computing theory, algorithms and such, not practical software engineering. You can actually learn computer science without programming.
Game Design should probably be an art school discipline.
Going to college is about being able to convince some entity with money to give you some of that money in exchange for your labor, so that you can then exchange that money to support your addiction to food, clothes, and shelter.
It is perfectly reasonable to spend four years of your life and tens of thousands of dollars and expect to learn something you can use to get a job.
I don't pragmatically disagree with you, but this is a debatable point.
Technically, going to college is about gaining knowledge of a topic. For computer science, that would be learning the theory of computing.
What you're talking about is a trade school to become a software engineer.
There are people who want to get a PhD in computer science or go on to do research of some kind. Those people do need a theory based computer science curriculum, not just a software engineering trade school.
We currently conflate these two things. It mostly works out but we should really be more explicit about it in my opinion. I went to a top CS program and I learned 90% of what I needed to know to be a software engineer at my first job, not in school. Something about that feels off to me.
How many of those 100K students do you think are interested in learning the theory of computer science, versus just getting a job?
How many jobs are available in the US for a pure “computer scientist”? And in an ultra-competitive job market, why would anyone hire someone who doesn’t know how to code when they can get someone for around the same price who can?
I don't disagree with any of this. I'm just explaining that universities are not trade schools, even though we often treat them like they are.
The theory is that somebody with a background in computer science is more likely to be capable as a software engineer. There is a high correlation between the two. But nonetheless, that does not make a computer science program a software engineering trade school.
Wouldn’t someone with a computer science background who also knows how to actually program be better?
If I need someone who can actually write code for standard enterprise development - where most developers work - who would I be better off hiring: someone who knows theory, or someone who knows how to code?
Do doctors, lawyers, teachers or any other professionals graduate just knowing “theory”?
Medical school and law school are professional programs you take after your bachelor's degree.
I've seen some discussion about making engineering programs something you do after a pre-engineering bachelor's degree, simply because four years isn't enough time to really learn those fields.
I think a better comparison to what you're saying is that getting a history degree doesn't qualify you to become a history teacher; you still need to take a teaching program on top of it.
I personally double majored in EE and CS, and neither program prepared me for working in those fields. They just gave me the theoretical background and not very much application experience.
For EE, you barely learn enough physics and background material to start doing a bit of application engineering in your senior year. Another two years on top of that is definitely required for job training. And that's one of the programs they're thinking of making a 4+2 year thing.
I think you could modify a CS program to have less theory and more practical application classes, but that would be a software engineering trade school, not computer science. I did do a software engineering master's on top of my CS undergrad, and that was somewhat more helpful for job preparation. I will say I haven't used very much of the computer science theory I learned at my jobs, so that wasn't a good use of time for job training.
I graduated with a degree in computer science from a state school back in the 90s, and it tried to prepare people for a job. Would I have been prepared to design and write a double-entry, networked data-entry program with 10 screens in C with a database backend, by myself, as my first job, with just the degree? Maybe?
I learned C in college. But I learned to program in assembly and a little BASIC before going to college, self-taught.
Yeah, we did a DS&A class, networking, and I believe a database class was offered soon after I graduated. But we also learned how to code. Now it seems quaint that some of the coding classes were in COBOL and FORTRAN. But I have a classmate who is still working for the same payment processor he got a job at in 1995, still doing COBOL.
This is a specific 4-year degree to become a teacher, and the first link that showed up when I looked for a math teaching degree (my mom is a retired math teacher).
People go to college and get a bachelor's with the hope of getting a job, with the exception of those getting an undergrad degree expecting to do postgrad work in a specialty that requires it; even then, the hope is to learn skills to get a job.
This is a 4-year nursing degree. After passing the exam, you are qualified to be a nurse. Again, not a “trade school”.
“Focus: building on a base of fundamentals in programming and computational theory to provide a solid foundation of knowledge and skills for applying digital processes effectively to issues of broad interest in a global society.”
Applied learning is the opposite of theory, and they teach “programming”.
And here they focus on “careers”, none of which are about getting a PhD or learning for the sake of learning.
Part of the problem with this whole thread is that there are lots of colleges out there, and lots of students, and they all have different goals and expectations.
I went to university in the 90s and got a degree in comp science. Their stance was not "preparing us for a job" but rather creating an understanding of computers, algorithms, and so on. The theory was that "learning how they work" means you make more informed decisions on the job.
For example - in databases, understanding the theory (third normal form, searching, sorting, etc.) leads to better database design in any database. Learning everything about one particular database (in our case Informix) is not necessary.
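To make that concrete, here's a toy sketch of the idea (my example, not from the course; sqlite just keeps it self-contained):

    import sqlite3

    db = sqlite3.connect(":memory:")

    # Unnormalized: the customer's name and city are repeated on every
    # order row, so changing a customer's city means updating many rows
    # (a classic update anomaly).
    db.execute("""CREATE TABLE orders_flat (
        order_id      INTEGER PRIMARY KEY,
        customer_name TEXT,
        customer_city TEXT,
        item          TEXT)""")

    # Third normal form: every non-key column depends on the key, the
    # whole key, and nothing but the key, so each fact is stored once.
    db.execute("""CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        city        TEXT)""")
    db.execute("""CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        item        TEXT)""")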
For my career, this approach worked well. I've been able to easily transition to new things, the fundamentals don't change, just the syntax.
But when choosing a college, it's important to align your expectations with their strategy. Most disillusionment with college happens when these don't align.
I quit playing most first person video games due to motion sickness. The first game I remember being unplayable was Half Life 2 and that was even after changing things like FOV.