Samsung seems to be targeting a sweet spot: "Costs less than Apple, superficially looks like an iPhone, product lineup includes smaller form factors, good enough."
It doesn't work for me, but that's because I courageously use my headphone jack.
Don't be surprised when the answer is "not much". Apply supply and demand to electric power generation: if your grid rate is getting hiked, then so is the market price of used solar.
Texas State Bar is still a thing. This means that it has split from the American Bar Association, but the legal system of Texas is still part of the US Legal system.
> Texas State Bar is still a thing. This means that it has split from the American Bar Association, but the legal system of Texas is still part of the US Legal system.
Lawyer here, member of Texas and California bars. There seems to be a misunderstanding here:
1. A state bar is what a lawyer has to belong to in order to practice regularly in that state (with some exceptions, e.g., for federal-court practice). Example: To practice regularly in California, a lawyer must be a member of the State Bar of California. That normally requires passing a bar exam or (in some states, if you're an experienced lawyer) getting in by "reciprocity."
AFAIK, every state bar is separately regulated by the highest court of the state (and, sometimes, by state statute). Example: The State Bar of Texas is subject to regulations promulgated by the Supreme Court of Texas.
2. In contrast, the ABA is a purely voluntary private association of lawyers. A lawyer doesn't have to belong to the ABA in order to be a lawyer or to practice law.
3. IIRC, the ABA's governing body includes liaisons from state bars. But AFAIK, there's never been any official governing connection between the ABA and any state bar.
4. The ABA's law-school accreditation standards [0] are a way for states to adopt uniform standards, thus avoiding the cost of developing individual standards (and of complying with a variety of standards). Those ABA standards are roughly analogous to national model building codes for plumbing, etc. — they're adopted by various jurisdictions but have little or no legal standing in any given jurisdiction unless adopted.
Do you think malware creators find out by reading HN or GitHub? I don't understand the vitriol; the request that "GitHub should take a harder stance" could have a chilling effect on security researchers, pushing high-impact exploits deeper underground.
Another point: firstly, GitHub shouldn't take a harder stance; but, it being Microsoft, one might argue that GitHub already does take one in cases like this, and it actually does.
Even so, it wouldn't really accomplish much, because buying a domain name and hosting this elsewhere would be easy.
There are some service providers who will only comply with a takedown if you provide a genuine, valid legal complaint (like a court order), and I don't think any court can order something like this, because there is presumably legal backing for genuinely writing "this tool is for educational/research purposes" when that's actually the case. So I don't really see how GitHub's stance would matter in the end: if it takes a court order to remove it, then GitHub will comply with that as well (even more readily than those providers).
I don't understand what the OP wants. Should this be hidden away on some obscure Tor .onion forum for hackers, or should it be on GitHub, where people can read about it, learn about this vector, and patch the servers they may have thought were safe because they didn't know this issue existed in the first place? (A hacker might still dig through obscure sources, but a sysadmin, comparatively, might not.)
There isn't vitriol, or at least I didn't mean it that way. The point I was trying to make is that I've seen malicious code like viruses, keyloggers, and rootkits being distributed via GitHub, and they use "this is for education" as a cop-out when the rest of the repo makes it extremely obvious what the real intention is.
Malware is very easy to build. Competent threat actors don't need to rely on open-source software, and incompetent ones can buy what they use from malware authors who sell their stuff in various forums. Concerns similar to yours about "upgrading" the capabilities of threat actors were raised when the NSA made Ghidra public, yet the NSA considers the move itself to have been good (https://www.nsa.gov/Press-Room/News-Highlights/Article/Artic...).
People will build malware. It is actually both fun and educational. Them sharing it makes the world aware of it, and when people are aware of it, they tend to adjust their security posture for the better if they feel threatened by it. Good cybersecurity research & development raises the bar for the industry and makes the world more secure.
Have you ever heard the phrase:
"To stop a hacker you have to think like a hacker."
That's cybersecurity 101. Without the hacker's knowledge or programs, you're just a victim or target. But with this knowledge made available, now you are aware of this program/possibility. It's like when companies deploy honeypot servers to capture the methods and behavior of hackers attacking the server, to build stronger security against their methods and techniques.
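To make that concrete, here's a purely illustrative toy sketch in Python (not a real honeypot; the port number and the print-based logging are placeholder choices I made up): a listener that accepts connections, records who connected and what they sent, and does nothing else.

    # Toy, purely illustrative honeypot: a TCP listener that accepts connections,
    # records who connected and what they sent, and never does anything else.
    # (Real honeypots are far more involved; this only shows the basic idea of
    # observing attacker behavior instead of blocking it.)
    import datetime
    import socket

    HOST, PORT = "0.0.0.0", 2222  # arbitrary example port

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        print(f"decoy listening on {HOST}:{PORT}")
        while True:
            conn, addr = srv.accept()
            with conn:
                conn.settimeout(5)
                try:
                    first_bytes = conn.recv(1024)
                except socket.timeout:
                    first_bytes = b""
                # Log the attempt; a real deployment would feed this into
                # detection tooling instead of printing it.
                print(f"{datetime.datetime.now().isoformat()} "
                      f"{addr[0]}:{addr[1]} sent {first_bytes[:80]!r}")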
If you want to see a comparison against an even broader set of open-source compression algorithms, there's lzbench (it's linked directly from the ZXC GitHub page).
lzbench has added ZXC to its suite. This makes a nice apples-to-apples comparison possible.
"The QWERTY layout became popular with the success of the Remington No. 2 of 1878...
"The 0 key was added and standardized in its modern position early in the history of the typewriter, but the 1 and exclamation point were left off some typewriter keyboards into the 1970s."
There are always a few oddball variations, but desk work will probably still use a QWERTY keyboard in the year 2100.
A college-level approach could look at the line between Math/Science/Physics and Philosophy. One thing from the article that stood out to me was that the introduction to their approach started with a problem about classifying a traffic light: is it red or green?
But the accompanying XY plot showed samples that overlapped or at least were ambiguous. I immediately lost a lot of my interest in their approach, because traffic lights by design are very clearly red, or green. There aren't mauve or taupe lights that the local populace laughs at and says, "yes, that's mostly red."
I like the idea of studying math by using ML examples. I'm guessing this is a first step and future education will have better examples to learn from.
> traffic lights by design are very clearly red, or green
I suspect you feel this because you are observing the output of a very sophisticated image-processing pipeline in your own head. When you are dealing with raw matrices of RGB values, it all becomes a lot more fuzzy, especially when you encounter different illuminations and exposures and the crop of the traffic light is noisy. I'm not saying it is some intractably hard machine-vision problem, because it is not. But there is some variety and fuzziness in the raw sensor measurements.
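As a toy illustration (made-up numbers, not the article's data): simulate "red" and "green" lamp crops under random exposure, glare, and sensor noise, reduce each to a single red-vs-green feature, and see how close the two classes get.

    # Toy simulation (not the article's dataset): nominally pure red/green lights,
    # observed through random exposure, glare, and sensor noise, then classified
    # with a naive fixed threshold on a red-vs-green chromaticity feature.
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_light(color, n=1000):
        # Nominal emitted color of the lamp.
        base = np.array([0.9, 0.15, 0.1]) if color == "red" else np.array([0.1, 0.85, 0.2])
        exposure = rng.uniform(0.2, 1.6, size=(n, 1))   # under/over-exposure
        glare = rng.uniform(0.0, 0.4, size=(n, 1))      # washed-out whitish glare
        noise = rng.normal(0.0, 0.1, size=(n, 3))       # per-channel sensor noise
        return np.clip(base * exposure + glare + noise, 0.0, 1.0)

    def redness(rgb):
        # Brightness-normalized red-minus-green feature.
        return (rgb[:, 0] - rgb[:, 1]) / (rgb.sum(axis=1) + 1e-6)

    red, green = sample_light("red"), sample_light("green")
    r_feat, g_feat = redness(red), redness(green)

    # A hard threshold at 0 ("more red than green") misses some noisy samples.
    errors = int((r_feat <= 0).sum() + (g_feat > 0).sum())
    print(f"misclassified {errors} of {len(r_feat) + len(g_feat)} samples")
    print(f"red feature range:   {r_feat.min():+.2f} .. {r_feat.max():+.2f}")
    print(f"green feature range: {g_feat.min():+.2f} .. {g_feat.max():+.2f}")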
That the HDMI Forum does not allow TVs to be sold with DisplayPort is a massive reason I think they deserve to have their building surrounded by angry people with pitchforks and torches. Anti-competitive abusers, doing awful things to prevent a better world.
DisplayPort actually makes sense as a digital protocol, whereas HDMI inherits all the insane baggage of the analog past and just sucks. HDMI is so awful.
No, they don't put DP on because every $ of hardware they fit to the TV needs to provide value. DP requires a large board component that may need manual handling, circuit traces (+ decoupling) and silicon on the chip to interface. It then requires software support in the stack and that needs testing/validation.
The percentage of people who will actually use DP to connect their TV vs. HDMI is tiny. Even people who do have DisplayPort on their monitors will oftentimes connect them with HDMI just because it's the more familiar connector. I spent a decade working in that area, and we were literally debating spending cents on devices that retailed for hundreds or thousands. The secondary problem is that ~90% of TVs sold use the same family of chips from MStar, so even if you wanted to go off-track and make something special, you can only do it with off-the-shelf silicon unless you pay a fortune for your own spin of the silicon. If you want to do that, then you'd better commit to buying >1m chips or they won't get out of bed.
The HDMI Forum was founded mostly by TV manufacturers; they're not interested in constraining the market in that way. It's all just been market consolidation and making TVs cheaper through tighter integration.
Oh wow, that explains a lot. I sort of always figured it was just market momentum that meant you never see TVs with a DisplayPort, sort of like:
... we need a digital video link
VESA develops DVI
... a market gap for TVs is identified
the HDMI Forum develops HDMI, which is basically DVI with an audio channel
... while technically a minor feature, that audio link was the killer feature for digital TVs and led to HDMI being the popular choice for TVs
VESA develops DisplayPort, a packet-based (vs. streaming for DVI and HDMI) digital link; its packet nature allows for several interesting features, including sending audio and driving multiple screens (toy sketch at the bottom of this comment)
... no TVs use it; while DisplayPort is better than HDMI, it's not better enough to make a difference to the end user, so HDMI remains the norm for TVs. You can find a few computer monitors with DP, but you have to seek them out.
I will have to see if there is some sort of stupid "additional licensing cost" if a TV is produced with DisplayPort; that would explain so much. I don't claim that there are no TVs with DP, but I certainly have never seen one.
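Out of curiosity, here's a toy Python sketch of the packet-vs-stream idea from the timeline above. To be clear, it's a made-up illustration, nothing like DisplayPort's actual micro-packet protocol: it just shows how tagging chunks with a stream ID lets audio and a second screen share one link, which a fixed raster stream can't do.

    # Toy illustration of packet-based multiplexing (not real DisplayPort):
    # chunks from several logical streams are tagged with a stream ID,
    # interleaved onto one "wire", and routed back out at the sink.
    from collections import defaultdict
    from dataclasses import dataclass
    from itertools import zip_longest

    @dataclass
    class Packet:
        stream_id: str   # e.g. "video0", "video1", "audio"
        payload: bytes

    def mux(sources):
        # Round-robin interleave packets from each stream onto a single link.
        per_stream = [[Packet(sid, c) for c in chunks] for sid, chunks in sources.items()]
        link = []
        for round_ in zip_longest(*per_stream):
            link.extend(p for p in round_ if p is not None)
        return link

    def demux(link):
        # Sink side: route packets back to per-stream buffers by their tag.
        streams = defaultdict(list)
        for pkt in link:
            streams[pkt.stream_id].append(pkt.payload)
        return dict(streams)

    sources = {
        "video0": [b"frame0-slice0", b"frame0-slice1"],
        "video1": [b"frame0-slice0"],            # a second screen on the same link
        "audio":  [b"pcm-block0", b"pcm-block1"],
    }
    wire = mux(sources)
    print([p.stream_id for p in wire])   # interleaved order on the wire
    print(demux(wire))                   # each sink gets its own stream back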