If you can use open source, you can build hardware (redeem-tomorrow.com)
335 points by gustavo_f on Sept 5, 2023 | 134 comments



I'm going to complicate this a bit and say "If you can use open source, you can prototype hardware"

Part of building hardware is making it robust enough to exist in meat space long term. That means thinking about how the humidity sensor is affected by ambient conditions (including the packaging bag; that one has bitten me in the past) and having a plan for re-calibration if drift becomes too great. That means picking connectors for your wire harnesses that can handle the number of times you expect to connect/disconnect them over the course of your thing's lifespan. That means tuning the length of that wire harness so you can't damage it when you open the enclosure to change the battery or whatever. It means thinking about how ambient conditions affect the rest of the design, so you don't have to clean the contacts on all the wire harnesses every so often because you didn't get gold contacts for both the harnesses and the connectors, and you live in a high humidity environment.

Don't get me wrong, I'm self-taught on virtually all of these points, it is achievable for the hobbyist. Just understand that swapping out one smart relay controller for another is pretty far from having a smart relay controller you'd even give to your sister-in-law for Christmas.


I’m a big proponent of open source hardware but, as your post shows, it often involves skills from many disciplines that require rigorous thought or trial and error. Electronics and physics are unforgiving in a way processors are not.

Even after reaching the prototype phase, the open source hardware is probably only useful to one person: its creator.

There is a big difference between making a prototype and documenting the build in sufficient detail that other hobbyists can replicate, modify, or use it. Documenting hardware is substantially harder than documenting software. If the project is cool, a bunch of people will be excited to jump in; some of these people have zero experience soldering or ordering laser-cut parts or whatever. Supporting them is hard.

Then another step up to sell the design to other hobbyists, even just a few extra copies on Tindie.

And then a huge step up from that to selling to the general public, where suddenly FCC interference certifications are needed and the company is liable if the design burns down a few houses. There’s a reason firms making hardware have real engineers on staff held to professional standards. Plus all the cash flow and business concerns when the marginal cost per unit isn’t under 1 cent like software.

Each of these steps often involves multiple iterations of hardware, and therefore adds lead time and cost.


There are a lot of OSHW projects I'd love to work on, but the main thing that holds me back is knowing they'd basically go nowhere. I can't post them online for others like with code.

Nobody is going to build it; the physical building of it is way harder than the design, and anyone who could build it is too busy building their own projects that will go in the junk drawer in a week.

I would love to work at a real OSHW company, making IoT gadgets and stuff for production and sale as polished commercial products with a software ecosystem behind them... but I lack a degree, live in Montana, and don't drive, and there are not many companies like that (and most of them are making expensive FOSS phones that don't run normal apps, cryptocurrency stuff, or glorified dev boards kinda pretending to be products).


Really depends on the specific project of course, but there are definitely some "posted online like code" projects out there. It's a more technical target audience of course, but I've seen plenty of projects with design files included, ready to be sent to a PCB manufacturer. Two categories I can easily think of are mechanical keyboards and modular synthesizers.


Keyboards are an interesting case, people are so into them, and also specifically want them to be custom made just for them, so people are willing to build or even commission.

I keep thinking that maybe high end flashlights could be the same way, I can definitely think of a few features that don't show up in your typical light, and that it might be cool to try to make a few boutique lights to sell, but my business knowledge isn't quite up to that, and I don't exactly have much desire to do a whole lot of independent work, I much prefer having an employer.


The thing I see with flashlights is that the physical object is pretty complex and has to face thermal, power and environmental constraints that would destroy any keyboard.

Maybe it could start as a gut replacement for an existing cheap flashlight and grow from there.


Am I underestimating how complex a battery, bulb, and switch are? What complexity is in a modern flashlight? I remember proudly making a flashlight in 1984 or so from scavenged components, including silvering my own reflector.


It's simple at the low end; GP's talking about much higher power & light output gear than you get in a Christmas cracker. Marine applications, cycling, caving, etc.


There are definitely strong thermal concerns with a flashlight now that everything is LED with lithium cells and the chips want to stay cool.

Unless maybe there's a demand for simpler lights somewhere? I tend to forget that non-LED light sources are actually a thing, but it's pretty cool to make one from scratch, down to a hand-silvered reflector!

I'd probably be putting LED and battery temperature sensing in anything I made, so I wouldn't be too worried about safety, but it would be a mechanical challenge that would likely take some iteration to get right.


A pretty common feature is single-button dimming. For LEDs, that actually requires a microcontroller and a more sophisticated power regulation scheme than just a current limiting resistor.

There are also custom colors, flash patterns, etc.
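
To make the microcontroller part concrete, here's a rough Arduino-style sketch of single-button mode cycling (pin numbers and brightness levels are made up for illustration; a production driver typically feeds a current setpoint into a switching regulator rather than PWMing the LED straight off a pin):

  const int BUTTON_PIN = 2;                    // momentary switch to ground
  const int LED_PIN = 3;                       // PWM-capable output

  const uint8_t LEVELS[] = {3, 40, 128, 255};  // moonlight .. turbo duty cycles
  const uint8_t NUM_LEVELS = sizeof(LEVELS) / sizeof(LEVELS[0]);
  uint8_t mode = 0;
  bool lastPressed = false;

  void setup() {
    pinMode(BUTTON_PIN, INPUT_PULLUP);
    pinMode(LED_PIN, OUTPUT);
    analogWrite(LED_PIN, LEVELS[mode]);
  }

  void loop() {
    bool pressed = (digitalRead(BUTTON_PIN) == LOW);
    if (pressed && !lastPressed) {             // new button press detected
      mode = (mode + 1) % NUM_LEVELS;          // cycle to the next brightness
      analogWrite(LED_PIN, LEVELS[mode]);
    }
    lastPressed = pressed;
    delay(20);                                 // crude debounce
  }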


Wouldn't an additional button be simpler than a microcontroller? Button A turns on ~30% of the lights, Button B turns on the remaining ~70% of the lights. That also adds a bit of redundancy if a connection fails.

I've only been caving twice. Dimmer lights are great - but they don't need to be on a single switch. Quite the opposite, putting them on a single switch means that one _has_ to cycle through the bright option when that is not wanted.


Nitecore uses half-press to cycle modes and full press for on/off. 2 modes really isn't enough on a modern light, the nice ones have a very dim mode, a few normal ones, and a turbo with 1000+ lumens.

Buttons take up space, and cost more than the electronics (not counting craptastic ones that will break in a hurry), plus you'd still need all the other driver electronics like the switching converter and the lithium protection chip.

The microcontroller is probably the most reliable thing on a flashlight, simple doesn't always mean cheap and reliable.


Flashlight design is neither my hobby nor my profession. But I own a bunch of flashlights, and single-button dimming seems to be table stakes for a lot of commercially available head lamps and lanterns.

Also, with the right micro, a second button (with relevant environmental protection) may well be more expensive than putting that micro and voltage regulator under a big blob of black epoxy.


I think that's the easy part; it's just a really simple (by embedded-project standards) PCB and firmware.

They have premade modules, but I'd have to do my own, since most of the point of making yet another type of light would be Bluetooth so you never lose it, can customize the modes, and it can also serve as a motion sensor that pings your phone if someone is messing with your stuff.


Flashlight nerds are crazy; it's definitely as big a market as custom keyboards.


There’s also ergogen, which is a project that generates the PCB for your split keyboard based on a few inputs. Then send it to your favorite online PCB maker and you just have to solder the components.

And there are a bunch of dactyl manuform case generators. (Which the online pcb fabs are now also offering to print in your favorite material)


Eh. As a counterpoint.

I live in Montana, have a degree in an unrelated field but still work on IOT projects.

I work mostly from home or a private office and have never needed to drive in the course of work.

Not sure how you exist in Montana without driving.

If you’d like to move into that field, it’s certainly not easy but it is possible.


Even seasoned professionals can get caught in the gap between 'prototype' and 'production grade' especially for things that are on the margin of what can be done with a particular hardware recipe. That's when component variation can cause your product yield to go straight into the sewer.


> Supporting them is hard.

Ding!

We had a bespoke wireless entry system for our hackerspace which kinda sucked. Eventually the board switched it out for OpenPath (which also sucks--to be fair).

Why?

Support. The board can now call someone and say "We pay you. Fix this."

Support is the bane of consumer products. I really wish we had some way to counter this.


And all of this together still doesn't solve the bigger problem with DIY hardware, which is the DIY itself.

If it goes wrong, you cannot buy a new one or hire a repairperson at a sane price. If it has a software side, it will probably need maintenance. If you want one, there's a large chance you might want another to expand your project.

While yes, I am able to design a reliable hardware device, unless you have a large budget it will not be immune to direct baseball bat hits or spilling epoxy in the connector. So, in practice, if you ask me to build something for you, I'll try to find a way to do it with off the shelf parts as much as possible.

Which sucks, because electronics projects are super fun, but the fun is dampened by the fact that in the end you have this completely unique irreplaceable thing that becomes a liability if you use it for anything important, which is generally tied to one application and becomes junk if you no longer need it, unlike the more general purpose off the shelf stuff.

ESPHome and Amazon modules plus 3D printing gives a pretty good balance for a lot of things. Reconfigurable, machine-soldered reliability, a prefab software stack, but still enough flexibility to build novel things.


> If it goes wrong, you cannot buy a new one or hire a repairperson at a sane price. If it has a software side, it will probably need maintenance.

It's not clear to me that the alternative provides these either. Just thinking about some of the appliance-type things I've had issues with lately: my oven would've made more sense to replace than hire a repair person, and my ISP-provided router is running their latest firmware which is horribly out of date...


Yeah, but your ISP router does work, and you could replace it if it didn't. Oven repair might be expensive but it's possible (Usually, some places you might have to wait a month, like I did when the heater went out), and not that expensive.

With DIY stuff, sometimes you can't replace it because there's no equivalent, you've invented novel functionality, and bought other things that depend on it.

Like, one time, when I had a very different mindset, I made a controllable light that used a non-DMX protocol, and took power over XT60. I don't know where the special USB adapter for it is.

If "Number of direct dependents" and "Total of all dependents that are in some way customized" are more than just a few, then it's pretty nice to have standard stuff.

Building things that are effectively clones of what you could just buy isn't that interesting to me, but making novel things comes with future unpredictability.

I like to look for projects where there either just isn't any commercial thing at any reasonable price that would work, or the thing is non-critical, or there aren't many design decisions in other things based on the custom thing I'm doing.


I think that's shifting the goalposts a bit - one can buy weird proprietary stuff from a company or product line that disappears, or build stuff that uses standard interfaces.


Just buy some spare tapeouts from digikey when building ¯\_(ツ)_/¯


Doesn't solve the issue of nobody else knowing how to build it, and the ones that do are often somehow making $60 an hour; now you're stuck with this thing that could become your responsibility at any time.


  > but the fun is dampened by the fact that in the end you have this completely unique irreplaceable thing

For many, having a completely unique irreplaceable thing _is_ the appeal.


It's wonderful for a piece of art, not so much for a daily use functional object that could cause inconvenience if anything happens to it, and which has to be frequently moved and handled and such.

I suppose people have widely varying tolerances and desires for unpredictability in daily life, some people really seem to like actually using unique tools, self hosting, customizing their computer with original scripts, etc.


Agreed. Also, the kinds of passive safety needed to not burn your house down in the event of a code error or other design issue.

The hardware design is the last line of defense before you can do real-world damage.

Things like fuses, ESD and surge protection, and watchdog timers often get overlooked in a hobbyist or even open-source design... it takes (sometimes hard-won) experience to know when these things are required.


There are some physical constraints as well. I have an essential tremor - painting Warhammer minis and doing anything with a soldering gun are forever out of my reach.

That all said - I have written firmware for things that other people have wired and it's quite fun!


I have heard that using magnifying glasses or a microscope can help suppress shaking in the hands: it has a weirdly helpful interaction with the hand-eye feedback loop. Dunno if it would work for you, but it might be worth trying?


I'll have to give that a try - though it didn't seem to help my father very much. He was a model railroader and just got used to taking several dozen passes at painting cars and locomotives. For me, I've found that stress tends to make it worse, so it's a bit of a vicious cycle where trying to suppress the shaking can spur it on more. Advice is never unappreciated though, so thank you for your consideration!


FWIW, I have a terrible hand tremor as well. I've found that with a good, wide-aperture magnifying glass or binocular microscope, I'm able to do soldering and even chip-level wirebonding. Having an armrest or other surface I can support my forearm/wrist on, with a tight structural loop to the target, can also help a lot.


Second the armrest suggestion, it makes all the difference. You can be rock steady if you don't have to support the weight of your whole arm from the shoulder, the closer you can rest your hand to where the action is the more stable you'll become. Another thing that helps is breath control.


As much as I agree with this, when buying off-the-shelf things, especially at the extremes of "very niche" (ok, this kinda falls under your "hobbyist design") or "so general there are hundreds of knockoff versions with various cost-cutting measures", there is no guarantee that they have thought of all (or any) of the required safety measures... Check out Big Clive on YouTube if you haven't already, and aren't afraid of knowing about all the different ways products skimp on safety.


A watchdog timer is not hardware protection. It might seem like it is, as the timer itself is in hardware, and it does occasionally protect from hardware-related lockups, but it's an all-too-easy mistake to stick the watchdog refresh in a timer interrupt somewhere that still works even when the rest of the code has gone tits up.
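
A minimal sketch of the difference (the function names are placeholders for whatever your vendor or RTOS actually provides):

  #include <cstdint>

  void watchdog_kick() { /* placeholder for the vendor-specific refresh call */ }

  // Anti-pattern: a timer ISR kicks the dog unconditionally, so the watchdog
  // never fires even when the main loop is wedged.
  void timer_isr_bad() {
    watchdog_kick();
  }

  // Better: each task checks in, and the dog is only kicked from the main
  // context once every task has reported in recently.
  volatile uint32_t alive_flags = 0;     // each task sets its own bit
  const uint32_t ALL_TASKS = 0x07;       // say, three tasks

  void sensor_task()  { /* ... */ alive_flags = alive_flags | 0x01; }
  void control_task() { /* ... */ alive_flags = alive_flags | 0x02; }
  void comms_task()   { /* ... */ alive_flags = alive_flags | 0x04; }

  int main() {
    for (;;) {
      sensor_task();
      control_task();
      comms_task();
      if ((alive_flags & ALL_TASKS) == ALL_TASKS) {
        alive_flags = 0;
        watchdog_kick();                 // the only place the dog gets kicked
      }
    }
  }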


This reminds me of when I was in college (EE) and working at an electronics store. A small aircraft owner wanted help with a regulator to (IIRC) drop 28v down to 12v and handle a few amps. I resisted helping design a solution but he kept pushing, so I suggested putting a couple TO-3 packaged 7812s in parallel. We bench tested it and it worked, so he went on his way. A few years later I learned you never do that, as one regulator can end up handling the load and it ends up overloaded. Instead you use a pass transistor (or other mechanism) to allow a single regulator to do the job. I still wonder if that guy's plane ended up going down in flames...


You can get away with it if there is some resistance in series with each one, but yeah, the enemy here is that each of the regulators will have a slightly different voltage and the unlucky one with the highest will handle most of the current.

Although I'd imagine you got lucky here, because IIRC this particular chip's voltage drops with temperature a tiny bit, so technically the one that starts to heat up would drop voltage, letting the others pick up the slack.
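
Back-of-envelope for why the sharing is so bad (made-up but plausible numbers): with each regulator feeding the common output through a small ballast resistance R_b, the current imbalance is set entirely by the set-point mismatch,

  \Delta I = I_1 - I_2 = \frac{V_1 - V_2}{R_b}

so a 0.25 V mismatch (well inside the 78xx tolerance) across 0.1 Ω ballast resistors is a 2.5 A imbalance: on a 3 A load, one part carries about 2.75 A and the other 0.25 A. With no ballast at all, R_b is just the tiny output impedance of the regulators and it's even more lopsided.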


> "If you can use open source, you can prototype hardware"

You can prototype some hardware. I’ve looked into trying to build some stuff that goes beyond what a little prepackaged MCU dev board can do, and I can’t wrap my head around it. Too much stuff involved that I'm no good at.


Compared to software skills those are relatively easy to learn though, and they have a longer best-before date than any language/framework kind of knowledge.


Rather the opposite, I'd say.

Software is deterministic and quite easy to reason about. It either works, or it doesn't. Hardware relies on actual physics, and even minute changes can be the difference between working perfectly fine and not working at all.

A lot of hardware design is based on rules-of-thumb and institutional knowledge. Learning those as a hobbyist is incredibly difficult, and most of the time you essentially end up cargo culting what everyone else is doing - and there is no guarantee that everyone else is doing the right thing either! It is really easy to end up wasting hundreds if not thousands of dollars like this.

This is exactly why companies like Adafruit have become so big. They take care of all the hard parts, and provide the hobbyists with essentially a bunch of lego bricks which neatly click together. The only thing you have to do yourself is... the software.


I've never spent as much time on hardware bugs as I've spent on software bugs. If the software you've worked on is 'easy to reason about' then you've led a charmed life!

That's probably also why all software is 'bug free' ;)

But seriously: both software and hardware have their unique challenges. But those can be overcome, and just like software, hardware can be 'unit tested' by breaking down circuitry into manageable chunks. Adafruit is a success simply because they fill a need: the ability to create bespoke gadgets without investing a lot of $ or learning a new skill. They market to programmers, not to hardware people, though I'm sure there is some overlap as well due to the convenience. But those skills are not substantially harder than software skills, they are just different.

I'm kind of lucky: I got into software through hardware rather than the other way around. To me software was an infinite parts budget (bounded by RAM limitations, usually). Hardware was a running expense, computing a one-time expense (or so I thought, hah!). So I simply got more mileage out of my pocket money and Saturday job earnings by saving for a computer rather than by spending it on various hardware components.


I’m not so sure about that. Learning a programming language for example is pretty easy, iterative, and had quick feedback for me. Learning years worth of math makes my eyes glaze over. I do agree on the latter half though, regarding how they’re useful for much longer.


You won't need 'years worth of math' to be able to prototype hardware. There is plenty of tooling now that will take the sting out of timing and other nasty little details and there is plenty of hardware where those details don't even matter all that much.

Good starting point: an FPGA evaluation board, such as Digilent's offerings. Those pack enormous power in a tiny setup and will teach you a ton of very valuable skills.

If that looks like a hit you can decide to deepen your knowledge.


You won't need "years worth of math" to be able to prototype digital hardware.

As soon as there's a non-trivial analog element - anything frequency-dependent, resonant, exceptionally resistant to RF interference, or switching significant current - you absolutely do need that math.

You can model resonant filters with DSP, but you still need to understand z-plane digital models. It doesn't hurt to have some idea how they relate to s-plane analog models.
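
For anyone wondering what a "z-plane model" looks like concretely, the textbook two-pole resonator is the canonical example (a generic illustration, not tied to any particular design):

  H(z) = \frac{1}{1 - 2r\cos(\omega_0)\,z^{-1} + r^{2}z^{-2}}, \qquad \text{poles at } z = r e^{\pm j\omega_0}

It peaks near ω0, with a -3 dB bandwidth of roughly 2(1 − r) radians/sample as r approaches 1; its s-plane cousin is the familiar ω0²/(s² + (ω0/Q)s + ω0²).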

Cook-book tinkering is plenty fun, but you really can make things explode or burst into flames if your project is switching and/or carrying any significant load.


I've built massive RF stuff with high school math and it worked quite well, better than some off the shelf stuff, and that's after a nice session with a spectrum analyzer to make sure you don't end up spewing garbage all over the higher bands. What really helps is to have access to good measuring tools and to know how to use them, as well as people with more experience than you to help guide you.

Stuff exploding or bursting into flames I've seen exactly once, on one of the most trivial circuits I ever built: a small boost converter for a windmill to charge batteries in low wind conditions. It worked extremely well. Until I disconnected the battery for service and then the boost converter kept on increasing its output voltage until the capacitors let out the magic smoke. Other than that stuff occasionally breaks. Oh, and if you do do RF stuff: beware of RF burns, that is a real risk, coils and capacitors in high power RF circuits should be treated with proper respect.

I'd be much more wary of Lithium-Ion batteries than analog stuff, and buck-boost converters are cheaper to source as complete units than to build yourself (though you definitely can if you want). Your typical hobbyist isn't going to start off by building themselves a large inverter or an HVDC interconnect. They're going to build amplifiers, other audio gear and maybe some measuring kit or digital devices. Sound generators, function generators and so on.

By the time you reach the stage where you need to design a resonant LC circuit you'll have picked up a lot of working knowledge and some of that will tell you what bits to avoid and what bits you can probably handle.

I know plenty of HAMs that know enough math to be dangerous but they usually would not be able to do really complex stuff without access to tools (though I also know some HAMs that definitely would be able to do really complex stuff, they also have the corresponding higher level license).

Let's not pretend that everybody that builds electronics for hobby purposes is a math wizard, it just isn't true. Though it definitely doesn't hurt to have a basic understanding of RC and LC circuitry and to understand how to use op amps and other interesting components like that. Applying those is vastly different from designing them from scratch.
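
(For reference, the kind of basic working formula meant here, with made-up example values:)

  f_0 = \frac{1}{2\pi\sqrt{LC}} \qquad\Rightarrow\qquad L = 10\,\mu\mathrm{H},\; C = 100\,\mathrm{pF} \;\rightarrow\; f_0 \approx 5\,\mathrm{MHz}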

Also: quite a few people have a ton of fun just building kits and slowly expanding their knowledge and there is absolutely nothing wrong with that. At the highest levels you will need that math, but there is plenty of interesting stuff to be done lower on the ladder. HN is the last place where I would expect such gatekeeping.


I think anytime you move past the lumped circuit model you can run into trouble. This includes digital circuits with fast edge rates. On the other hand a close reading of a component manufacturer's application notes and reference schematics can help a lot of people who may have only limited formal training in electrical engineering.


Any tips for getting started for a software engineer? I built a couple CPUs in a circuit simulator, I'd love to get these things running on silicon of some kind and benchmark them against each other, but I wonder if I would be biting off more than I can chew.


I'd definitely go the FPGA route initially; it has software-like advantages, such as being able to reprogram stuff without having to tear it all up and do it all over. It also elegantly avoids having to build up soldering skills (which is a bit of a pain with SMD). Once you get the hang of that, some simple CMOS circuits, through-hole on a breadboard, would be a gateway drug to building stuff for real. If you want another in-between step I'd go for a kit of some sort, something that you want to have anyway but would rather build yourself; there are quite a few producers of such kits and they range in complexity from 'blinking LED' to 'build your own glass teletype' and everything in between (and even more complex).

Compared to software it is a costly hobby though, and it also occupies more space beyond just a laptop. And it can be quite messy.


Can you give an example? There may well be an easily accessed IC for it.


Mostly I was looking at trying to do custom RF stuff, trying to create custom hardware. Could have used an SDR, but I think I still would need a solid handle on the math for that.


That's definitely the realm of "actual electrical engineer". There's a lot you can learn to get most of the way there without the math, but to actually understand why you've got e.g. an impedance matching network on your antenna trace requires some mathematical gymnastics that's easiest to get in school. That can feel pretty frustrating, but on the other hand a lot of the hardware at work there can be had off-the-shelf as modules, so you don't have to do any RF black magic. Just standard build-quality questions.
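
To give a flavor of the math: the simplest case, a textbook two-element L-match stepping a source resistance R_s down to a load R_L (the numbers below are just an example, not about any specific design), works out to

  Q = \sqrt{\frac{R_s}{R_L} - 1}, \qquad X_{\text{series}} = Q R_L, \qquad X_{\text{shunt}} = \frac{R_s}{Q}

e.g. matching 50 Ω down to 10 Ω gives Q = 2, a 20 Ω series reactance and a 25 Ω shunt reactance, which then become an L and a C at the operating frequency (L = X/2πf, C = 1/(2πfX)). The gymnastics start when the load is complex and frequency-dependent, like a real antenna.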


I recently started at a big connector manufacturer... And I have to say, there is so much more that goes into connectors, even "simple" ones, than what is obvious to the end user.


Want to second this comment. Spatial efficiency, insert and locking mechanisms, electrical characteristics, mechanical strength characteristics, machine pickability/placeability, solderability, thermal characteristics, thermal impact to overall system, environmental seal properties including water/saline air/weak acids/weak bases, orientation guarantees, safety factors such as susceptibility of shorting with different categories of pollutant including metal dust, longevity/susceptibility of contacts to build-up of dust, formal fire engineering properties (including combustion temperature/off-gassing), fabrication cost including line purchase/scheduling/maintenance and hard tooling fabrication/longevity/maintenance, color coding for human error reduction in assembling and maintenance, documentation and translation, distribution and recalls, etc. That's just the connector. Now look at regulatory outlook, availability and all of the above mentioned concerns for the relevant cables and their fabrication processes...


No different than writing software to a releasable stage. Prototyping is a very important stage of building hardware; it's how you test. If you get yourself that far, you can get yourself to a production build. It does mean a bunch of reading and research though. I've been doing embedded/electrical/mechanical systems for decades, from consumer grade products to large industrial machines which have to last decades. There are often hard-earned lessons along the way, but many end up jumping out at you. Solving some of those problems sometimes requires significant rethinks, but a lot of stuff is not that tricky. Main thing is having good tools to investigate problems.


> No different than writing software to a releasable stage.

It's extremely different, imo.

Releasing buggy software to prod: no biggy, hotfixed in a couple of hours

Releasing buggy hardware: recalls, mass customer dissatisfaction.


IMHO the oversight in this response is that having good tools to investigate problems essentially equates to (1) tens of thousands of dollars in hardware; (2) permanent lab space; (3) the relevant base capital; (4) supply chain access; (5) a university equivalent level education in physics and electronics; and (6) at least a few years to hone your craft. This is realistically a five to ten year commitment, perhaps a little less full time if enthused and adequately capitalized.


Some things that are cheap at low scale are quite expensive at scale; 3D printing is the obvious one here. The way you design to use less 3D printing may be the opposite of how a regular plastic manufacturer works, so you need to adapt your process to the process of your suppliers.


I wasn't even touching manufacturing at scale, because sometimes you really do need just the one. But it should not be so fragile that you can't carry it from the garage to the thermostat mounting position.

I learned that the hard way when I automated the heat lamp that I put in my chicken coop. Having to noodle around with screw terminals while being pecked at by an angry rooster was not a great time.


The rooster didn't approve of your soldering technique?


Rooster thought he should have used chicken wire.


3D printing looks like linear scaling, and from what I saw 3D printing services are pretty cheap.

Yeah, they are more expensive than running a printer in your garage because they need to earn money too, but it's not like the price grows with volume.


As someone who sells a low-volume niche product (as a sideline), the problem isn’t that 3D printing costs grow per-unit, but rather that they don’t meet people’s intuition of what plastic things should cost.

A box that might cost $0.25 if injection molded might be $25 if 3D printed.


I've breadboarded a number of projects, but always seem to hit a wall when faced with the concerns you describe. Do you have any pointers for how to gain the knowledge to get past this? Right now I feel like I don't even know what I don't know.


A lot of it comes down to being mindful of what you're spending the most time on during assembly, but some of it is just hard-won. But I've learned a lot from reading Hackaday.

Some simple things that you shouldn't have to learn the hard way (but most people do):

Make sure your wiring contacts are electrochemically compatible. Gold-to-gold is safe in almost every household environment.

Strain relieve every wire. Solder is not meant to be structural.

Every circuit component degrades over time. Heat, humidity, and dust accelerate that process. Make a plan to mitigate the ingress of each, and a plan to account for that degradation.

Learn to design simple breakout-board carrier boards. The best breadboard layouts are still worse than a mediocre PCB, because the PCB doesn't have flywires to catch on literally everything.

Make sure you include mechanical support points for your designs, and pick the right size and material for your mechanical supports.

All of this to say, your hardware thing is a thing first, and an expression of your software/firmware design second. If it cannot physically survive being that physical thing, the elegance or resiliency of your code is meaningless.


Please improve this (I have only dabbled), but I’ll add a couple points as well:

- Don’t run data lines and power lines right next to each other (electric signals flow through a field surrounding the trace/wire, not in or on the metal itself)

- PCB pros avoid right angles for the same reason. Bevel your corners. (You see examples of this on every board if you’re not sure what I mean)

- Verify PCB traces with a multimeter before soldering components to it (or if it’s been assembled by the PCB manufacturer, verify everything before powering it on for the first time)


> - PCB pros avoid right angles for the same reason. Bevel your corners. (You see examples of this on every board if you’re not sure what I mean)

If your design suffers from the consequences of this, your reach has probably exceeded your grasp. It's true that you can get noise from sharp corners, but unless you're running SPI at maximum speed, it probably won't cause any bugs in your project. And if you need to run that fast, you're going to run into other, less straightforward signal integrity problems too.

PCBs with right angle traces look ugly though. So I might still judge you for it, but only if you also wear white before Memorial Day.


And right angle traces are more prone to delamination, which is the major reason why you want to bevel your corners.


Etching would be difficult. I.e., if you bend two traces side by side around a 90-degree corner, watch the etching around the corner: copper may be left on the inner angle of the turn.

So, I do not use 90 degree turns for this reason, if not for the EMI reason.


Yes, the EMI thing is real but not at typical hobbyist frequency ranges. But the mechanical aspects are far, far more important and right angles are simply a bad idea. Ideally the lines are smoothly flowing (like say at the bottom of the old KIM boards), but that's not how auto routers place the traces. 45 degree angles in succession are a good enough compromise. They're mechanically reasonably strong, they don't delaminate and can be easily placed and used for bus patterns with closely spaced traces that will reliably etch without the outside being eaten up and the inside being 'too late'.


At what frequency ranges does it start to become an issue? Cheap controllers can run in the hundreds of MHz now.


>The limiting factor that will determine whether any resonances are excited is the size of the square region in a right-angle PCB trace. In particular, the lateral size of the region will be approximately equal to the quarter wavelength of the lowest order resonance, so this gives you a good baseline for estimating the fundamental resonance frequencies. The remaining harmonics will be approximately odd multiples of the fundamental frequency. If we allow a very generous trace width of 30 mils with an effective dielectric constant of 3.5, the lowest order frequency is 112 GHz! If we take this as a knee frequency for a digital signal, this is equivalent to a 3 ps rise time, which is well-below that for commercially available digital components. [0]

That whole article addresses the myths surrounding right angle traces pretty effectively.

0. https://www.nwengineeringllc.com/article/right-angle-pcb-tra...


>Don’t run data lines and power lines right next to each other (electric signals flow through a field surrounding the trace/wire, not in or on the metal itself)

Not true. The electrons certainly do travel within the copper. The movement of the electrons generates a magnetic field around the conductor, but the electricity does not "flow through a field surrounding the trace/wire". The electric power absolutely does flow through the metal itself.

>PCB pros avoid right angles for the same reason.

This is a myth except maybe in very rare cases. Most hobbyists aren't ever going to have a problem with right angle traces.

https://www.nwengineeringllc.com/article/right-angle-pcb-tra...

>Verify PCB traces with a multimeter before soldering components to it (or if it’s been assembled by the PCB manufacturer, verify everything before powering it on for the first time)

You should be sure that your design works before sending it to be assembled. If you designed the PCB with proper software that does analysis between the schematic and the PCB design, then there really shouldn't be any surprises that would require you to verify any PCB traces with a multimeter before soldering components. Sure, you may have had it manufactured by a crap PCB company, but it's unlikely; PCBs have gotten really easy to make. Software like KiCad, if used properly, makes it practically foolproof to design a PCB that matches the schematic.

Designing the schematic is another matter though, it's very easy for a noob to get that part completely wrong and testing PCB traces with a multimeter is not going to fix that.

>or if it’s been assembled by the PCB manufacturer, verify everything before powering it on for the first time

Not sure what that would accomplish. What are you going to test? Many components can't even be tested unless power is applied. Seems like you're suggesting superstition more than practical knowledge about hardware design and manufacture.


Thank you.


Take things apart. Fix things, while observing how they break. There are amazing online videos on how to repair virtually anything that goes wrong with a home appliance. Make things that improve your life at home. These are things that a lot of hardware people did as kids, including myself. Get your hands involved.

Look inside older stuff that predates 3d printing and cheap mold tooling, just to avoid the trap of everything being made the same way. In my case, since I'm interested in music, I've looked inside things like guitar pedals and amps, which often solve the problem of making something that's robust, but that can be made profitably in short runs and small shops.

Get a hold of the McMaster-Carr catalog, in paper form, and leave it in the bathroom. An old Digi-Key catalog if someone still has one.


The Cave Pearl Project (arduino based underwater data loggers, used for real science in cave systems) has a blog and several Youtube videos with info about ruggedizing electronics for humid environments and temperatures from freezing to about 60 Celsius. [1] Two words: conformal coating.

Lots of other good in-the-trenches reporting of hard-won knowledge in the blog. Many epoxy resins shrink significantly, for example. That may or may not be important for your project. The blog is not super condensed but it's worth reading, especially for seeing the evolution of design and construction practice from the early years (2011) to now.

There's a book, now somewhat dated, on the Protection of Electronic Circuits from Overvoltages (lightning strikes, or fridge motors, for example): [2] TVSes (transient voltage suppressors) are still in use, however. Even varistors.

Connectors are the bane of every electrical engineer's life. There are more designs of connectors than of any other category of component, and probably there are good reasons for all of them to exist. I haven't got any good references for this topic though.

Other things, like fuses, fireproof insulation on your power cables, physical design such that prying objects can't touch high voltages, and so on, are about protecting the rest of the world from your projects.

Rod Elliott's web site [3] is a mine of information for beginning to intermediate hobbyists. It's focused on analog, audio specifically, but when you get down deep enough, everything in electronics is analog. You need to know about resistance, capacitance, and inductance, earthing (grounding) layout, and other similar topics.

1. https://thecavepearlproject.org/2023/03/17/waterproofing-you...

2. https://store.doverpublications.com/0486425525.html Available on Amazon as an ebook.

3. https://www.sound-au.com/articles/index.htm


Lots of gatekeeping and snark in these comments. Every time something hard gets easier, the pure who suffered hardest come out of the woodwork to inform you that the easy thing you're doing is not as good as the hard thing they've been doing since you were in short pants.

Composition is great for prototyping and small-scale production. As you level up and learn about optimizing BOMs and DFM, you will start to swap out MCU boards for your own designs; you'll see how that $10 I2C rotary encoder can be replaced with $1 worth of resistors, capacitors, a Schottky diode and a hex inverter.
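
To make the encoder example concrete: once the I2C breakout goes away, the decoding moves into a few lines of firmware. A hypothetical Arduino-style half-resolution decoder (pins are placeholders, and it assumes the RC/hex-inverter stage has already cleaned up the A/B signals):

  const int PIN_A = 2;   // interrupt-capable input, already conditioned in hardware
  const int PIN_B = 3;

  volatile long position = 0;

  void onEdgeA() {
    // After an edge on A: if A and B now match, we're turning one way; else the other.
    if (digitalRead(PIN_A) == digitalRead(PIN_B)) {
      position--;
    } else {
      position++;
    }
  }

  void setup() {
    pinMode(PIN_A, INPUT_PULLUP);
    pinMode(PIN_B, INPUT_PULLUP);
    attachInterrupt(digitalPinToInterrupt(PIN_A), onEdgeA, CHANGE);
    Serial.begin(115200);
  }

  void loop() {
    Serial.println(position);   // counts up/down as the knob turns
    delay(100);
  }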

Anyhow, I came to say that with companies like JLCPCB and PCBWay offering 3D printing and CNC services, you don't even need to buy a 3D printer to get started.

Heck, with https://wokwi.com/ you might not even need prototyping components.


Totally agree with your comment on the gatekeeping and snarkiness. Also my current foray into hardware (as a software guy) tells me there's tons of low hanging fruit on the design rules side to cover all sorts of scenarios for "production ready" component selection, placement, and environment concerns.

My gut tells me that the software market that serves hardware engineers isn't nearly as creative or ambitious as that on pure software and even devops or infrastructure.

Huge opportunity there.


The problem is that 1) "design rules" are in practice more like guidelines, and often need to be violated in order to actually get stuff made, and 2) all the data is wrong, contradictory, and cannot be trusted.

A lot of the work of a hardware engineer is reading and interpreting datasheets and trying to separate the wheat from the chaff. The low-hanging fruit which can easily be automated is the easy part of the job, and writing the input data for the automation ends up taking more time than just manually doing it yourself.

I have dabbled in writing some software extensions for KiCad, and some turned out to be very useful and now save me quite a lot of time. However, every time I tried to be "clever" and solve a seemingly easy problem, it ended up not being worth it in the end.


You are not wrong overall, but I am not sure if the opportunity is huge, at least IMO not enough to sustain a VC-backed company (or perhaps barely). As a benchmark, Altium has a market cap of 6B. The fundamental problem is that there aren't that many HW engineers out there (compared to SWEs and SWE adjacents like DevOps etc). And the existing players are super entrenched into existing companies doing HW design.

There are some interesting companies out there that I am watching, like flux.io. The problem there is that none of these companies are working on creating open-source tooling, so their endgame seems to be getting acquired by Altium, Cadence et al.

I fear a future where doing even regular PCB designs will be gatekept by the Cadences and Synopsyses of the world, akin to how IC design is today. At least we have KiCad right now, which is getting really powerful and is fantastic for doing PCB development work.


I'm in a similar boat so maybe totally naive, but this seems true. I think the software industry is super rich in tooling because software engineers understand software, and can build their own software (haha..). Non-software fields have crap software because usually their only way to get some is to hire a software person who doesn't actually understand the industry. Introducing a communication barrier like this massively dampens productivity. Things are much smoother when the person using the software can actually dig in and fix the kinks themself instead of filing a Jira ticket.


Big companies buy solvers and support contracts, not interfaces. That's why everything has a crappy buggy interface slapped on it.


Interesting, I am in the same boat as the author. I've recently started to dabble more in embedded devices. I am building a well water level sensor. First I tried to use an NRF-based board, but I got bogged down in the SDK ecosystem; it's really meant for experienced embedded engineers at companies. Then I fell back to much simpler ESP32-C3/S3 boards, which are great: widely supported, easy to set up and pretty reliable. I hooked it up to the distance sensor (HC-SR04) and made the distance calculations work. You also have to add a voltage converter if you want to run from batteries, because the sensor requires 5V - easy enough after some reading and failing. Then you have a mess of boards and cables, and you need to solder it all to a board, which requires tools and a bit of playing around.

Then I was missing an enclosure. I tried a few store-bought junction boxes, none were perfect, and I decided to buy my own 3D printer (the future is now, print your own things, learn modelling, etc). Those are actually pretty easy compared to everything else; I printed my first models <1h after receiving the printer. Modelling in programmatic OpenSCAD, or my currently preferred tool CadQuery, is easy to pick up in a few hours of playing around. So yeah, I have had my printer for exactly a week, I have done almost a dozen successful prints, and I've designed a couple of usable and functional parts. Don't be afraid of 3D printers. Oh, and you can get good printers for much less than 500 USD; I bought a lightly used second-hand Sovol SV06 for 150EUR (220 new) and it works really well.

The idea is I didn't find an existing water-well sensor for my purposes, so I am building my own. Final price of BOM probably 20EUR. Time spent learning and tinkering - hundreds of hours. Cost of stuff I had to buy to support all this, probably somewhere around 500EUR now. (printer, connector crimpers, cables, MCUs, solder boards, sensors, battery holders, electronic components, filament for the printer, soldering iron, etc). It has all been worth it.
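
In case it helps anyone attempting something similar, the distance-reading part boils down to very little code. A rough Arduino-framework sketch for an ESP32 (pin numbers are placeholders, and the sensor's 5 V echo line should be divided down to 3.3 V):

  const int TRIG_PIN = 4;
  const int ECHO_PIN = 5;   // via a voltage divider; the HC-SR04 echo output is 5 V

  void setup() {
    Serial.begin(115200);
    pinMode(TRIG_PIN, OUTPUT);
    pinMode(ECHO_PIN, INPUT);
  }

  float readDistanceCm() {
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);   // 10 µs trigger pulse starts a measurement
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);

    // Echo pulse width in µs; 30 ms timeout is well past the sensor's ~4 m range.
    unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);
    if (us == 0) return -1.0f;      // no echo received
    return us * 0.0343f / 2.0f;     // sound travels out and back at ~343 m/s
  }

  void loop() {
    Serial.println(readDistanceCm());
    delay(1000);
  }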


> There is a learning curve to 3D printing. It might be the steepest factor here, in fact.

I love CAD but man do I hate 3D printing. It's a type of device that seems to have been invented to illustrate Murphy's law that "anything that can go wrong, will go wrong".

The print nozzle gets clogged. Every. Time. The filament breaks at the worst possible position and requires some disassembly to remove. The printing stops for no reason in the middle of a long print. The plate is never exactly even. You forget to leave the spool enough room to spin freely, and when the machine pulls on the filament it makes the spool fall, pulling the whole machine down with it with a vengeance.

And of course, it takes hours.

I have been much more lucky with external providers that you can send your file to, and they send an object back. It's often expensive and takes even longer than doing it at home (days vs hours), but you can't put a price on peace of mind.


That is just how physical reality is. Everything breaks all the time and your process needs to be resilient to it. It is definitely the case that consumer 3D printers make that practically impossible though. (I tend to think home CNC is more interesting for this reason).

For example, an industrial solution to some of the problems would be a checklist based inspection of the printer between every task, but this would be incredibly tedious.

Software that has to directly interface with reality also has these problems.


It feels like we are trying too hard not to learn the lessons of metal casting.

As metallurgy goes, casting is very old tech. I won't say it's the simplest thing, but it was simple enough that pre-industrial people were figuring it out, which says something about complexity.

Lathe work starts out with cylinders or polyhedrons, which

But we only carve shapes out of polyhedrons, or build them up from nothing. It seems more likely that for complex, concave or knotted objects, we should be using low resolution additive printing and high resolution subtractive printing in combination.

We may also need CNC machines with an additional degree of freedom. Perhaps not a fully 'prehensile tail' but the ability to tilt the cutting head to say 45º would probably reduce the gap between additive and subtractive manufacturing's achievable shapes by quite a bit.


Getting a machine that works well out-of-the-box is critical. While tinkering and troubleshooting is definitely part of the 3d printing experience, my Prusa Mini requires very little babysitting and maintenance.


Could that be due to consumer vs industrial 3D printers? The external providers are probably using more expensive industrial printers designed for more frequent or continuous usage vs a typical consumer printer.


As a software-only guy, this article brings me great encouragement for doing a hardware project in the future! :)

Although, to be honest, my bigger problem is probably just simply not having a use case which I could use a self-built hardware project for. I don’t feel like I’m missing or lacking anything in my life or at home that could be fixed with a hardware project.

Additionally, I usually want the absolute best solution to a problem that I can afford. Commercial products have satisfied me well so far. My mindset about this is that if I can just pay someone for a product that solves my problem, I will gladly do so instead of scratching my head with a self-built project (I consider my time more valuable than anything else).

So I guess what really needs to happen to make me actually dip my toes in the hardware soup… is to have an annoying enough problem that cannot be solved with ready-made products on the market (either because they are bad or outright don’t exist).


Sounds like typical content on today’s internet: enough buzzwords for a search engine to find it and too abstract to be useful.


Blog posts don’t need to be useful.


This is a valid criticism, but I don’t think that it’s necessarily the author’s fault.


I think it's a flaw inherent to the current system. You have to make money to live, and you do that not by appeasing human readers but by appeasing an algorithm. The world is not easily reduced into clear classifications, but we're currently forcing it into them.


> You can’t beat Prusa: the printers come out of the box working perfectly

At the cost of being old and slow. I wouldn't be throwing roses to Prusa after they effectively ceded the market to everyone else.

For $200, you can get a Sovol SV06 that's a smarter iteration on the MK3/MK3S (while also being open-source both in hardware and software); for $500 you can get a Bambu P1P that's much faster and has better vertical integration through the slicer (and for $100 more than that you can get a P1S, which is high-temp ready while also doing all the same things as the P1P).


I'm not sure I could.

I used to do microwave communications repair in the army. The most painful part of my education was basic soldering. I couldn’t solder for the life of me. I have the finger dexterity of a brick (which is to say none at all).

A few years ago I took a comprehensive career aptitude assessment, which included testing finger dexterity. I thought I’d done really well after taking the test. I was informed I scored in the bottom 5%. If I became a surgeon, my malpractice insurance would cost more than my annual salary.


JLCPCB PCB assembly service ([1]) is excellent and is really inexpensive. I used to reflow PCBs myself at home, but now I don't bother.

1. https://jlcpcb.com/capabilities/pcb-assembly-capabilities


That's not building hardware. It's connecting up and interfacing existing hardware components.

Which probably makes more sense than designing hardware components for most applications.

But it's not the same as designing circuits etc. and the title is a bit misleading as far as that goes.


Of course it is building hardware. It just isn't building all of it, which is what all of us do with everything we build, to some degree or another. I can run some wood through a CNC machine and I still count it as building something even if I didn't grow the tree nor cut it down nor kiln dry the wood nor cut it to exact size I needed for it to be put into a CNC machine.


Making a box of mac & cheese is still cooking— it's just not from-scratch cooking.


If anyone needs help making their hardware projects or products real and take it to market please feel free to reach out. Contact info is in my bio.


Saving your contact info :-)

I’m not there yet but I’m working on transitioning from software to hardware… so I want to get there eventually!


There are other starting points besides 3D printing, arduino/microcontrollers and spark fun sensors.

If you just need an enclosure for a product there are ready-made ones that you just drill and cut as needed. And for anything to do with sensing or automation look into industrial PLCs (automationdirect.com is the cheap supplier) before you start re-inventing the wheel.


So I’m posting this way late and I doubt anyone will read the comment. But I did a hardware startup once and was surrounded by other hardware startups in the space we were in.

Ughhhhhhhh, operating system updates, internet issues, test kit from China for which we had to use a specific version of cracked Windows XP and still do live support in broken English at midnight.

Hardware is hard - Never again!


Hardware building is an expensive hobby, and often involves aspects of engineering like heat, power, safety, etc.

I don't trust myself to build something that I can leave unattended and won't catch fire. How does one get over this?


I think it helps to outsource the “dangerous parts”, i.e. buying a power supply instead of building one. Apart from that, most applications stay in the 5V range and a few mA. If you are using something that requires more current then just over-engineer. A motor that uses 0.5A? Buy a 3A mosfet. Flipping 120v electricity? Buy a premade relay module that is already optically isolated and just feed it 5V signals.

When it comes to stuff failing in general at hobby level you either burn something instantly (plug the power to an IC backwards and see the magic smoke go away) or it just heats up VERY VERY FAST!

I once plugged an external 5V power to a development board that was already USB powered but I didn’t know it… it started smelling like something was burning within a few seconds and I burned myself by touching it instead of pulling the heat camera :-)


I don't know if you ever do, but designing with fat tolerances helps. If you're just building something once, it often doesn't cost much more to over-engineer it. Choose a more powerful processor than you need, add more cooling than you need, use more fuses and power-supply filtering than you need. Opto-isolate all I/O. Test it in a hotter/colder/more-humid environment than you'll use it in normally...


To build on what others are saying, when you have a project, just learn what is dangerous and be careful when dealing with that. For example, power supplies, wiring with mains, high current or voltage for motors, or LIPOs. Once you identify those key areas you can figure out how to handle them in a safe way!


Outsource dangerous building blocks to qualified people, overpay for quality components, learn proper wiring (ratings, crimping, etc).


Use listed current limiting power supplies. A 10W 5V wall wart is incapable of starting a fire no matter how badly you screw up.


Helps to know, thanks


"Using open source code is a skill: knowing how to navigate repos and someone else’s code, understanding how to troubleshoot and navigate communities to get help, discerning between quality projects and junk… this experience is a hard-won component of being a modern software explorer. It can take you further than you might realize, past mere bits and into the land of electrons and atoms."

Very wise words. Coming from sw/hw industries I probably could work around heat pump microcontrollers without too much hassle and I well know the pain of physical components messing up your debug process. But such industries rarely rely on open source, and all the OSS I used was for personal projects. That is definitely a big limit for my future work opportunities! :/


A thing preventing people from going into hardware (prototyping) is the cost. Software is cheaper than hardware.


Not exactly true. Many electronics manufacturers give out free samples. All kinds of free samples. When I was a kid (and even into adulthood) I would contact all the electronics manufacturers I could to get free samples. I had dozens of free Microchip PIC embedded CPUs and support chips. Back in the day Maxim semiconductor (now Analog Devices), and many others. I even got free stuff from Digikey, but that took some convincing of the right people at Digikey. Some of the products I begged and pleaded for - I got a full touchpad controller for free, including shipping, because I was a "student" and I was making a "prototype". It really wasn't that difficult to get free electronics to learn with. And for the passive components - resistors, capacitors, coils, and other parts there's always free broken electronics floating around, and I would harvest everything I could that I didn't already have plenty of.

https://www.microchip.com/samples/

https://www.analog.com/en/support/customer-service-resources...

https://reddit.com/r/electronics/comments/1qvcr2/how_to_prop...

https://www.ladyada.net/library/procure/samples.html


Sampling is a lot rarer these days. There is a reason your last two links are from 10+ years ago. Too many hobbyists tried to use sampling as a means of getting free parts for their personal projects.

Sampling is intended to get a sample so the company's expectation is that it will eventually result in an actual order. This will obviously happen when sampling to companies, and sampling to EE students means those students are more likely to choose your products when they enter the field.

Sampling to hobbyists doesn't really have any return on investment, so once they started getting thousands of requests they just shut it down. These days you are just expected to order low-quantity items from their distributors.


Getting free samples was always part generosity and part social engineering, even 30 years ago, before everyone in tech got an arduino. I'm fully aware of a company's expectations in giving away samples - even as a kid 30 years ago, but that doesn't mean a determined person can't still get free samples. The game hasn't really changed much. I'm just saying that the previous commenter's idea that it's expensive to get into electronics as a hobby is not exactly true.


Monetary cost is only a small part of it (and it has gotten significantly cheaper too, at least for small electronics).

The feedback loop is just very long. It takes a few weeks to get a PCB back unless you pay a lot extra to get it in a few days.

And even if you own a 3D printer for the mechanical parts, that's still a day of printing.


If you just want it to work you can use ESPHome, which means you barely have to code at all. There are software modules for a lot of the standard hardware components. It's Arduino plug-and-play.

I've built a device that times my espresso machine and controls the grinder. I built a 4x WiFi socket just because it was fun. Getting high-quality temperature, humidity, or CO2 readings is unbelievably easy, cheap, and fun.

Coupled with Home Assistant you can spend a lot of time and have a lot of fun. I did at least.
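
For anyone curious what that looks like without ESPHome, even the raw Arduino version of a temperature/humidity read is only a handful of lines. A minimal sketch, assuming a DHT22 wired to pin 4 and the Adafruit DHT library (the pin and sensor choice are just for illustration):

    #include <DHT.h>

    #define DHTPIN 4        // data pin the DHT22 is wired to (assumption)
    #define DHTTYPE DHT22   // cheap combined temperature/humidity sensor

    DHT dht(DHTPIN, DHTTYPE);

    void setup() {
      Serial.begin(115200);
      dht.begin();
    }

    void loop() {
      float humidity = dht.readHumidity();
      float tempC = dht.readTemperature();
      if (isnan(humidity) || isnan(tempC)) {
        Serial.println("Sensor read failed, retrying...");
      } else {
        Serial.print(tempC); Serial.print(" C, ");
        Serial.print(humidity); Serial.println(" %RH");
      }
      delay(2000);  // the DHT22 shouldn't be polled more than about twice a second
    }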


Is there anyone on earth not using open source in some capacity?


This group of people for one.

> the Sentinelese appear to have consistently refused any interaction with the outside world. They are hostile to outsiders and have killed people who approached or landed on the island.

https://en.wikipedia.org/wiki/Sentinelese

But more seriously I would say there is a difference between intentionally and incidentally using open source software.

I run Linux and FreeBSD on multiple machines. I use open source software intentionally.

My girlfriend runs Windows on her laptop. If we look closely I am sure we will find open source libraries being used both within the OS, and within other pieces of software that she runs. But all of that is incidental. She is not interested in software and that is fine.

My mother and my grandfather both use LibreOffice. But only because I installed it for them. So neither my grandfather nor my mother really are intentional users of open source software. It just happened to be the case that their grandson/son (me) knew about LibreOffice and installed it for them, so that they could use it to write documents and to open Word documents that other people sent to them.


Depends what you mean by "using" open source. If we include consumers of software that happens to have its source published but who couldn't compile it even if they downloaded the source (so, >90% of Chrome users, for example), then yes, there are lots of non-devs. Likewise, there are probably still some devs using licensed proprietary libraries in proprietary applications, with proprietary IDEs and compilers, though that's certainly getting rarer.


Watching the penguin screen continuously reboot on Delta airlines’ janky in-flight entertainment system should not count as “Using Open Source”.


I would bet that every form of motorized transportation has open source in the build or operating model somewhere.


"If you build modern software, you’re well-versed in composition: grab a handful of existing projects—a database here, a UI framework there, an HTTP library to round it all out—and arrange them together. You write your custom logic—the stuff unique to your project—and let other people’s code do work that’s common across all projects."

This approach certainly gets tried enough. I'd say it has some issues, though.


The hardest part of the hardware experience for me so far has been the waiting. I recently took the next step in being a keyboard nerd and have been tinkering with custom macro pads.

Currently printing the bottom of a custom osu! pad for the third time after a couple goofs.

Absolutely a blast though, especially coming from doing purely software. Even if you're just doing prototypes, highly recommended.


I really like the software/hardware opportunities we have today. But headlines like this just invite negative comments. It's like saying "if you can read a book you can become a nuclear physicist".

Or even "You don't need to learn Svelte!" (I love Svelte but statements like that are not helpful).


How hard is it to remake and improve a random component on any electronic device I own? eg the control panel on my microwave or my entire TV remote.

Would I need specific parts from the manufacturers?

Would dissecting the existing component give enough detail for me to remake without the (I assume proprietary/hidden) schematics?


Microwave control panels are pretty simple - usually just some buttons, a display, maybe a rotary encoder, and an embedded controller IC.

But you really do not want to be experimenting with custom control unless you know exactly what you're doing. Aside from the risk of nuking food and/or accidentally bypassing the door switch and microwaving yourself/partner/kids/pets/etc., most microwaves have huge power capacitors near the controller board.

An unplanned encounter with one of those can kill you.

Here's a sample circuit. It's not super-complex. But there's a lot to go wrong, and it's really not a beginner project.

https://www.electronicsforu.com/electronics-projects/microwa...

Remotes are basically the same with (usually) an IR transmitter, more buttons, and no dangerous power switching. It's not all that hard to clone one, but the hard part is making the tiny physical buttons and inventing a better UI.

https://www.youtube.com/watch?v=m7z4CU5mw9E
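
To give a sense of scale: dumping the codes a remote sends is only a few lines with the popular IRremote Arduino library (recent v3+ API). A rough sketch, with the receiver pin as an assumption; replaying a captured NEC code afterwards is one more call to IrSender.sendNEC():

    #include <IRremote.hpp>  // IRremote v3+

    const int IR_RECEIVE_PIN = 2;  // output of a TSOP-style IR receiver module (assumption)

    void setup() {
      Serial.begin(115200);
      IrReceiver.begin(IR_RECEIVE_PIN, ENABLE_LED_FEEDBACK);
    }

    void loop() {
      if (IrReceiver.decode()) {
        // Prints protocol, address and command for the button you pressed
        IrReceiver.printIRResultShort(&Serial);
        IrReceiver.resume();  // ready for the next code
      }
    }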


Maybe a bit optimistic, but I don't think it'd be too hard.

Most devices already use pretty standard components. A microwave, for example, would have "something" to switch the thing on and off; it might be a solid-state relay or similar. Maybe it has several: one each for the fan, the light, the motor that turns the food around, and the magnetron that emits the microwaves.

But once you figure out what signal is needed to drive those (a bit of intuition and a multimeter might suffice), you are off to the races!
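
As a sketch of what that signal usually amounts to: most solid-state relay modules take a logic-level control input, so driving one from a microcontroller is a single GPIO write (the pin number is an assumption, and the mains side stays strictly on the relay's isolated output terminals):

    const int RELAY_PIN = 5;  // control input of an opto-isolated SSR module (assumption)

    void setup() {
      pinMode(RELAY_PIN, OUTPUT);
      digitalWrite(RELAY_PIN, LOW);   // start with the load off
    }

    void loop() {
      digitalWrite(RELAY_PIN, HIGH);  // load on
      delay(5000);
      digitalWrite(RELAY_PIN, LOW);   // load off
      delay(5000);
    }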

Once you open a few household appliances it's easy to see how they optimized for cost, so you seldom find fancy protocols or components unless they are absolutely necessary.

In a toaster oven, for example, you might find a temperature sensor, and it would likely take a bit of reverse engineering to calibrate its voltage output against temperature (I'm assuming it's a cheap analog sensor rather than something that spits out a digital I2C signal).
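
Roughly, that calibration boils down to reading the ADC and fitting a line against a thermometer you trust. A hypothetical sketch (the reference voltage, slope, and offset below are placeholders you'd measure yourself, not real values):

    const int SENSOR_PIN = A0;   // analog temperature sensor output (assumption)
    const float VREF   = 5.0;    // ADC reference voltage, board-dependent
    const float SLOPE  = 100.0;  // degrees C per volt  -- placeholder from your own fit
    const float OFFSET = -50.0;  // degrees C at 0 V    -- placeholder from your own fit

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      int raw = analogRead(SENSOR_PIN);      // 0..1023 on a 10-bit ADC
      float volts = raw * VREF / 1023.0;
      float tempC = SLOPE * volts + OFFSET;  // linear fit from your calibration points
      Serial.println(tempC);
      delay(1000);
    }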

So yeah! It shouldn’t be too hard to hack your devices :-)


A lot harder than building your own from scratch.

When you are trying to improve an existing product, you first need to figure out what the existing part is doing. This is going to be incredibly difficult because you do not have access to the original documentation. Often it involves proprietary parts for which zero documentation is publicly available, and you are going to need quite expensive tooling to figure out what it is doing without those docs.

In general I do not really think this is viable for a beginner for anything beyond a completely trivial product. A microwave is a really bad idea due to the voltages and currents involved (you can easily end up killing yourself). A TV remote is probably doable, but mostly because you can do it without opening up the remote at all; you just need to look at the (often standardized) IR signals coming out.


This is half reverse engineering (understanding the existing part), and half engineering. The reversing can be quite difficult for less common parts/designs, and is partly a different skillset. But for standardized interfaces like an IR TV remote it can be pretty easy.


If you think building hardware is as easy as importing a library, you can burn your house down.


Ehh... sometimes. I tried this modular approach with a project; some things worked very well, others not well at all. In particular I have a ton of EMI noise in my audio circuit and no idea how to get rid of it.


I abandoned my last hardware project when the chip shortage started - it was close to impossible to buy parts quickly in prototyping volumes. Has it gotten better now?


Side question: are there any hobbyist groups or meet-ups people would recommend?


What, is 'using open source' some special thing people need to learn? You install a program and you use a program; 'open source' changes nothing there unless you want to start modifying it.



