Another Win For Linux

My special interest is computers. Let's talk geek here.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

You know way too much about changing light bulbs on a tower. LOL I can see why the design of tower lamps would change to accommodate automated maintenance. As I noted earlier, it's safer and cheaper. The drone as I see it would merely be the transport device that carries a robotic arm up the tower for light bulb changing. They use these kinds of arms on spaceships, so why not on terrestrial towers too? I've recently seen robots making pizza, which appears to be a little more complicated than changing a light bulb. All I know is a lot of people are going to be without jobs when the bots take over.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

A tower can be built with a Transport Channel running up one of the legs. Or they can add one later.
I haven't seen one yet that had an arm on the truck that moves up and down the channel, but it does carry the toolboxes, and I suppose lights. A worker can't ride up and down on it though; it's not designed for that much weight and has no safety features like an elevator.
It has two drive wheels that fit on the outside of the channel at the bottom, and a long brace above the box that fits into the slot and turns 90 degrees so the rollers are inside the channel. About the heaviest thing on it is the battery that powers it.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

Maybe my thinking is too advanced. More likely I don't know about all the complications involved. LOL Those transport channels seem like the ideal method for getting a remote light bulb changer up the tower. Perhaps remote-controlled robots weren't as sophisticated back then as they are these days, and might have been cost prohibitive. It just seems logical to do it that way and eliminate human intervention altogether.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

I agree, but realize when I was climbing towers, it was back in the early to mid 1970s, and it only cost about 250 to 500 bucks to have all the lights changed, plus the cost of the lamps themselves.
Today, with it costing around ten grand for someone to climb that tower, that expense is normally reserved for antenna repair folks. Many of them will take an extra 200 to 300 bucks on the side to carry a lamp up with them and replace it while they are up there. And the new lamps are small and lightweight compared to the ones I had to change out.
FWIW: Although we all carried company walkie talkies, most of the time you could not get a signal once you passed the 350 foot mark. Too much metal up there, and you were usually inside the tower framework until you neared the top, or stepped out to replace a side lamp.
We used small nylon line, like fishing line, to lower the spent bulbs down to the ground, and hoped the wind didn't knock them against the tower and break them. Most were reconditioned several times before going with a new one.
We used reels similar to fly fishing reels to lower the lamps down. If no one was below to take a lamp, we would tie the line off until we got the second lamp to the ground, then untie the first line and tie it to the second line over a cross rail, so when we were on the ground again we could remove the lamps, tie the line back to the spool, and wind the whole thing back up. But normally we had a man on the ground so we could climb without pulling lamps up with us, and he could untie the lamp we let down so we could wind the line back up on the spool for the next lamp.
It was normally nearly a whole day job to change out the seven lamps.
And you had BETTER have had a good early-morning constitutional before starting that climb, because things inside tend to work themselves loose as you climb, if you catch my drift here, hi hi.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

I can imagine the call of nature would be a problem when you are shimmying up a pole many hundreds of feet off the ground. Why being that high off the ground would, of itself, encourage such movement is slightly perplexing. It's not like you are up there against your will and frightened nearly half to death. I suppose barometric pressure changes, but not that much. I dunno. It seems odd, but perfectly believable.

Your comment about walkie talkies is interesting. I'm thinking even those 100 mW versions would have a quarter mile range and most likely the industrial variety were 5 Watts or more. The radiation pattern of most walkie talkies, if I recall correctly, is cardioid which broadcasts in all directions except down. Then again, I only know what Motorola did. You WERE using Motorola radios, were you not???
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

As far as I know, Motorola was the only company that made a lot of those things.
Heck, even the monster sized telephone transceiver in the trunk of my car was a Motorola.
This was also back before the days of Rubber Duckie antennas.
The walkie talkie would beep for an incoming signal, but you had to extend the LONG antenna to talk on it.
The best part of the design of the walkie talkies we had was the antenna went entirely inside the unit when closed.
We were told never to key up a transmission without the antenna being fully extended or it could damage the radio.
So maybe they were five watt units. I don't really know, nor do I know what frequency they worked on.
I do know the boss's radio had three channels, and he was an impatient old cuss too, which is hearsay from my uncle. I never met the boss, only his route man, who assigned us our jobs.
The leather cases on the walkie talkies were numbered 1, 2, or 3. I never saw one with a number besides these.
When the route man handed us a walkie talkie, he also handed us a paint paddle with the same number as our radio.
This was placed in a bracket on the panel box at the base of the tower. I assume this was so the boss, if he ever patrolled where we were working, would know what frequency the guy on the tower was assigned. Not that the radios worked once you were up past the 350 foot mark.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

Motorola derived its name from the car radios it made - motor+radio, or something. That was in the thirties. Their big, really big, claim to fame came during the war, where they supplied every walkie talkie the military ever used. Well, maybe not every one, but damned near close. By the time I was hired, Motorola was into many different things and the handheld radios were all made down in Florida. That's probably the case to this very day. I am fairly certain there were two bands of frequencies: one somewhere around 147 MHz and the other around 450 MHz. I know they also had a third band for the old time radios, and I think it could have been 67 MHz. Not sure anymore. It's all kind of fuzzy, plus I did not work for the walkie talkie division. Most of my career was involved with radiotelephone units.

There is a lot more competition these days, but when I first started with the company, you are correct to say that any service radio in America was made by Motorola. Western Electric was the distributor for radio telephones in the Chicago area - maybe other areas too. Each year they would enter into a contract with whichever manufacturer presumably offered the best radio for the price. Invariably Motorola would win that contract, even though the Japanese manufacturers were offering something significantly cheaper. The Western Electric people justified rejecting the offshore radios on the grounds that they were a security risk. Talk about sweetheart deals.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

I can tell you one thing, although it took up half of my trunk, I really enjoyed having a mobile telephone in my car. Glad I'm not the one who was paying for it.
While working for MRTC after they gave me a promotion, I was often traveling down to southern Missouri.
It was mandatory for me to have communications with the company and with my department.
I was allowed up to four very short personal calls per week, but never used more than one, if that. Didn't want to press my luck with the company.
I knew where the Ham Radio repeaters were located and could relay a message back home when I was out, and the frau would always get the message. It was super rare for me to have to contact her, only if there was a serious situation at one of the plants where I would have to stay overnight. But all the plants had landline phones I could use, so no need to use the mobile phone.
But once you got out away from the plants, following the pipelines, and spotted a problem area, then I needed to call the company immediately, if not sooner.
We also had planes that flew sections of the pipeline every single day, and if they saw a questionable area, then I had to go check it out on the ground.
Although I was quite active with them installing the third pipeline, only twice in my five years working for them did I have to have one of the two mainlines shut down to make repairs before something went kablooey, hi hi.

The natural gas in the main pipelines has no odor added; that is done by the distributing companies before they send the gas to retail and wholesale clients. In other words, residences and businesses.
High pressure mainline gas lines don't usually rupture unless there is a catastrophic event like an earthquake, and even then there are safety shutoffs before and after a fault zone that will shut the pipe down if any pressure differential is noted by the sensors.
That being said, as thick as those pipes are, sometimes they manage to find a wormhole leak in one of the welded seams. Every pass of welding is X-ray examined before the next pass goes on. But somehow, a little pinhole might develop in a weld from expansion and contraction, or just from vibration.
Since the pipes are under such high pressure, a little pinhole or two is not enough to cause a pressure differential, but it is enough of a leak that it must be fixed.
The tell-tale sign of a gas leak is a small patch of dead grass, which the pilot reports. Most of the time the area he spotted from the air was not dead grass, but a sandy dry area. Even so, I had to check every report he turned in.
Our sensors were so delicate and accurate they could pick up a gas leak too small to kill the grass above it, if there was any. It was a boring but interesting part of my job. Got me out of the office more than the rest of the guys, hi hi.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

I can't say that I touched every one of those trunk radiophones, but I did work on the only production line in Motorola that manufactured them. Chances are I inspected or tested something you had in the trunk of your car. Maybe only a 30% chance, but still. :mrgreen:

I've read where there is a major fault line going through Tennessee somewhere. There are pipelines crossing it as well. They say that some day there will be a huge disaster because that pipeline feeds a good part of this country. Then again, they say California will break away and fall into the ocean and that Yellowstone is a major volcano just waiting to erupt. I wouldn't miss California too much, but the dust from that volcano could hang around for many years.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

The three gas lines owned by MRTC ran from the Landa Wascom and Landa Woodlawn gas fields up to St. Louis. They only fed a couple of small cities along their route.
I worked for them back in the days before computers, so I don't know how they control things now.
But back then, most of the moving parts of the pumping stations were open and worked like a piston. They were amazing machines to watch run.
But as far as safety along the pipelines, there were mechanical differential sensors every ten miles, unless it was near an area that could be disrupted by earth movement, in which case sensors were only a mile apart, except right at a fault line. There, sensors sat every 200 feet in each direction for at least a mile, and sometimes for up to five miles, spaced a little farther apart the farther you got from the fault.
The most expensive of all the valves on the pipeline were those on either side of a fault line, designed to close instantly using heavy weights to drop the gates. There were also safety slam gates inside the pipes in areas where sensors were impractical. If one of these closed for some reason, it was a royal pain to get it to pop back open again.
The company did not like using slam gates, but they were required by the federal government before a pipeline passed through a populated area, which usually meant a large town or small city.
How they worked was simple. If the pressure on each side of the gate was equal, they stayed open. But if the pressure on the outbound side of the gate dropped too low, they would slam shut.
This sounds all well and good for protection against a broken or leaking pipe. However, sometimes a city fed from one of the laterals would draw more gas from the pipeline than expected, and this would cause the slam gate to shut, turning off everybody beyond that point on that pipeline.
A city might open a valve too much to fill their reservoir tank faster, and cause a slam gate to shut.
And the only way to get it to open again was to open a bypass bleeder valve to get the pressure back up on the other side.
This usually meant closing a main valve further down the line first until the pressure was back up on the outbound side to let the gate open. Several valves along the pipeline beyond also had to be adjusted to keep other slam gates from closing, so it was a major undertaking to get things up and running again.
This is another area where the sensors come into play: they help control the flow of gas through the pipe to prevent the slam gates behind them from closing. What sounds simple on the surface turns out to be quite complex in its operation.
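If you wanted to model that slam-gate behavior, a toy sketch in Python might look like the one below. To be clear, this is just an illustration of the logic as described above; the pressure numbers and the 85% threshold are invented, not anything from a real pipeline spec.

```python
# Toy model of a mechanical slam gate: it stays open while the
# outbound pressure tracks the inbound pressure, slams shut when
# the outbound side drops too far below the inbound side, and
# stays shut until a bypass bleeder re-equalizes the pressure.
SLAM_RATIO = 0.85  # hypothetical: shut if outbound < 85% of inbound

class SlamGate:
    def __init__(self):
        self.open = True

    def update(self, inbound_psi, outbound_psi):
        """Check pressures; once shut, the gate stays shut."""
        if self.open and outbound_psi < inbound_psi * SLAM_RATIO:
            self.open = False  # weights drop, gate slams shut
        return self.open

    def bleed_open(self, inbound_psi, outbound_psi):
        """Reopen only after a bypass valve restores the pressure."""
        if outbound_psi >= inbound_psi * SLAM_RATIO:
            self.open = True

gate = SlamGate()
print(gate.update(800, 790))  # True: normal flow, gate stays open
print(gate.update(800, 600))  # False: city overdraws gas, gate slams
print(gate.update(800, 790))  # False: pressure is back, still shut
gate.bleed_open(800, 790)     # bypass bleeder equalizes the sides
print(gate.open)              # True again
```

Note how the gate ignores the recovered pressure on the third check; that is exactly why the bypass bleeder routine described above is needed.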
But that's what happens when the government poly-tick-ians who know nothing about pipelines pass laws on how they have to be secured.

And while I'm hitting on the government: more often than not, when you hear about a gas line explosion somewhere, all the blame is placed on the pipeline owner, when in fact the explosion was caused by some government-mandated device.

Think about a water line. If you slam a water line shut, you will get water hammer in the pipes, which could blow a pipe apart. In most homes, any place a device is controlled by a solenoid, such as your washing machine or dishwasher, an anti-hammer column is installed on the water lines to prevent hammer, and it acts as a shock absorber too.
There is no such thing as a shock absorber column on a gas pipeline, and it would be illegal to install one besides.
Newer slam gates are actually several gates of decreasing size closing one after the other, like layers of different sized washers stacked on top of each other. Much more expensive and complex, but they still work mechanically if there is a pressure drop.

Almost any time you hear of a gas line explosion it was caused by someone damaging the pipeline and not reporting it.
Believe it or not, a small scratch in the pipeline coating can cause it to rust through fairly fast, even as thick as the pipe walls are. All of our pipes were first coated with a 3M epoxy coating, then wrapped with a fiberglass sheath coated in resin, then wrapped again in oil-soaked Kraft paper, and over that was usually another Kraft paper wrapping. So it takes a major hit to get through all of that. Water well drilling companies often hit the pipelines; even though they know where they are, they try to go close to them because it is easier to drill there, I suppose.
Road work sometimes damages a pipeline when they dig too far down to dump the gravel base.
But more often than not, it's not the mainline that blew; it's a lateral owned by some other company.
In St. Louis, all the lateral lines connecting to MRTC's mainline are owned by Laclede Gas Company, who must maintain their own lines. And FWIW, there is a multi-layered slam gate on their pipeline within three feet of a lateral connection point, and MRTC has an automatic, usually weight-driven, gate valve just before the connection.
Contrary to popular belief, a pipeline break is usually a minor break and rarely, if ever, accompanied by a fire.
Technically, natural gas is non-flammable until mixed with air in the right proportion to burn.
I've seen some pinhole leaks which were lit on purpose to prevent exposure to the raw gas.
The flame may never get closer than five to ten feet away from the pipeline, but really lights up the sky, hi hi.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

Forgot to mention, another check came from PCH, this one for ten bucks. Strange thing though: it was dated 5/30/19, which is before I received the one for 25 bucks on 6/12/19.
The envelope was inside another envelope this time, and still wet, like it got rained on somewhere en route.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

That's quite an elaborate fail-safe system you describe for pipelines. As far as criticizing government regulations goes, there are two sides to that coin. I'd hope the Federalis would have the best of the best working for them. That was the goal, at least in administrations prior to the current one. The regulations are intended to protect the people, and you probably are right to say they are more than is needed to run an efficient pipeline. But left to its own devices without any oversight, a company is very tempted to take shortcuts in order to preserve profits; safety be damned. I'd like to think that doesn't happen, but I also know that is one of the purposes for regulations.

My congratulations go to you for outwitting PCH once again. I won on three lottery tickets recently, which paid a total of $11. I added a dollar to that and bought six Power Ball tickets, wherein I had the fewest number of hits I can remember from such a large lot. I got nothing from that draw. It just reinforces my thinking that the tickets generated by machine are not exactly random. The lotto drawing is. I'm thinking that I'm probably breaking even over the long term and it's about time I won something big. I'm going to switch strategy next time around and come up with numbers of my own. I know for sure mine will not be random, not that it will matter. :grin:
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

None of the medium to big lottery winners ever used the machine-generated numbers; they all picked their own.
The granny down here that won the 4.8 million dollar lottery used the exact same numbers every week for something like 3 or 4 years on her lead ticket. She also entered the smaller lotteries using the same approach and won small amounts several times. Said she was only in the hole the first year they started the lottery here, but has been ahead of the game ever since. She did add that she only played the 4 number lottery the first couple of years which always had several winners. She never hit even a small winner when she was selecting random numbers.

I have earned over 21 million tokens on PCH and decided to use almost all of them for a chance to win a new Roomba. I also entered several tokens on other gift prizes, but less than a million tokens on all the rest combined. I picked a few I thought folks might not be interested in unless they read the fine print, such as a dorm fridge. In the fine print, you can take the 150 cash instead of the fridge. There were a few like that; some say you win cash to buy the item. I don't plan on winning any of them, not even the Roomba, because I'm sure there are over 6,000,000 entries for it. Each entry cost like 3,000 tokens, but I figure I've got over 7,000 slots in the draw.

Good luck on your lottery cards, I've never done well on them.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

I violated my own rule this morning and bought a lottery ticket with machine generated numbers. That's how I won the $11, but I am not confident I'll get anything out of what I did this morning. I was in a hurry and didn't have time to do what I said I would. It's not a big deal this time around because the payout is only $137 million, or $54 million in real cash. It's hardly enough to buy a new car. LOL

I've heard the same story a few times where people have a fixed set of numbers they play consistently. Those are the ones that win. We'll see how that goes next time I buy tickets.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

I learned something new, well sorta, about computers.
A server farm is absolutely nothing like a supercomputer, or a mainframe computer for that matter.
A supercomputer, despite its thousands of CPUs, normally only does one job at a time. It's mainly used for scientific research where a lot of numbers need to be crunched.
A server farm is basically a distribution center for requested files; it can be doing a million jobs at once, because all it is doing is taking a request, finding the file, and sending it to the person who requested it.
A mainframe computer is more like a glorified desktop; it does any and all types of jobs, from crunching small jobs to distributing requested files. And it has a GUI!

Why was this a surprise to me? Because the few times I went out to tour ORNL's supercomputers, I was led to believe they were doing many different jobs for many different people at the same time.
Well, that is only partially true. It's not like these ten people are using 10% of the resources and those three people are using 80% of the resources. They may all be using the supercomputer, but they are lined up like folks at a ticket window waiting their turn. The jobs the ten people are doing may only take 10 minutes total, 1 minute each, while the jobs the three people are doing may take four hours each. So, if you are behind one of those, you wait the four hours for your turn to do a one-minute job.
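To put rough numbers on that ticket-window picture, here is a quick back-of-the-envelope in Python. The job times are made up to match the example above, three four-hour jobs queued ahead of ten one-minute jobs:

```python
# FIFO batch queue, like the ticket window described above.
# Run times are invented: three 4-hour jobs, then ten 1-minute jobs.
jobs = [240, 240, 240] + [1] * 10  # run times in minutes, queue order

wait = 0
for run_time in jobs:
    print(f"job starts after {wait} min in queue, runs {run_time} min")
    wait += run_time

print(f"total wall-clock time: {wait} minutes")  # 730 minutes
```

So the last one-minute job sits in line for over twelve hours, which is exactly the effect described above.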
However, they also have a server farm at ORNL.

There is no GUI on a supercomputer; it's all strictly command line.
This does not mean the output from the supercomputer cannot be displayed on a desktop via its GUI, but the desktop would need a program to convert the data it receives into an image. It's not the norm though.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

mmm ... well ... yeah. That's all correct. LOL

An excellent example of a distributed server farm is good ol' Google search engine. It does exactly what you say, i.e., takes in queries and spits out matching records. In many ways their farm is like a single brain the size of the planet - they have several farms at several different points on the face of the earth. When a query comes in, it goes to all of them, which is why you can get a gazillion results in one picosecond. I don't think Google's server farms have GUIs either. Most don't even have terminals.
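As a hand-wavy illustration (not Google's actual code, obviously), that fan-out pattern can be sketched in a few lines of Python. The farm names, latencies, and results here are all invented:

```python
# Toy scatter-gather: send one query to every "farm" at once and
# collect results as they arrive. Farms and latencies are made up.
from concurrent.futures import ThreadPoolExecutor, as_completed
import time, random

FARMS = ["us-east", "europe", "asia"]  # hypothetical farm names

def query_farm(farm, query):
    time.sleep(random.uniform(0.01, 0.05))  # pretend network latency
    return farm, [f"{query}-result-from-{farm}"]

with ThreadPoolExecutor(max_workers=len(FARMS)) as pool:
    futures = [pool.submit(query_farm, f, "linux") for f in FARMS]
    for fut in as_completed(futures):
        farm, results = fut.result()
        print(farm, results)
```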

The supercomputer sharing is a lot like astronomical observatories. Everybody who needs time on the Hubble Space Telescope, for example, signs up and gets put on a list. It's obvious how a telescope can only perform one task at a time, which is what supercomputers do. Likewise the output from Hubble is just an enormously long string of numbers. The scientist has to take those numbers to some computer somewhere (probably a supercomputer) and analyze the data for significance.

That's all going to change in the not too distant future when quantum computers become commonplace. At first, only institutions that need such a computing device will be able to afford one. Not too long after that, when prices come down and technology advances, the hackers will have easy access to them too. By then, we all hope, passwords will be unnecessary. A quantum computer will be able to brute force every password ever used by humanity quicker than it takes to bat an eyelash. Imagine having a computer like that in your office. You might not need it, but it will be possible not too many years from now.
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

From my tours of ORNL, I was left with the impression that work on their supercomputer was allocated in many chunks, depending on the needs of the user. For example, if one project only required the use of, let's say, 100,000 CPUs, and the next user only needed 50k CPUs, and the next user needed 300k CPUs, then they all ran simultaneously. In other words, more than one user's data was being crunched at the same time.

But from the other articles I've read, supercomputers only do one job at a time. And the user only needing 50k CPUs would still get the use of all 500k CPUs, and his work would be done in a split second rather than in 5 seconds, let's say.
So all work orders are just daisy-chained one right after the other.

The thing that makes it even more confusing for me is that I can actually sign up to use the supercomputer as an individual; they have slots for individuals to make use of it.
I couldn't actually do it, because I do not understand their instructions on how the data to crunch must be presented. Way over my head, that's for sure, hi hi. But many of the college kids working on some things apply and get slots for their program to run, and they get back a file of output data.

On another note: like Glenn, I was with BOINC for a short time after I got set up on Linux, although they can use any computer that is sitting idle regardless of the OS.
After I bought the new HP and started using it, an old Dell computer I had bought used was really too slow for a lot of things I did; it only had 1 gig of memory, actually two 512 meg modules. Apparently, the amount of memory I had on my computer was not so relevant to them.
I installed their program and set it to accept all jobs 24/7 unless I hit the suspend button, which gave it 15 to 30 seconds to complete the job it was doing. You could suspend instantly, but then you didn't get the credit for a completed job.
The only way I knew they were sending jobs to it was because my Linux System Monitor was set to show CPU usage, Memory usage, and Network usage.
Of interest, the Memory usage never seemed to go up beyond where it went when I started the program.
Network usage would only occasionally have a few large spikes; it did have a steady pulse, but not much different than when I'm online myself.
It was the CPU usage I thought would be more. I knew when it was doing a job because CPU usage might climb to 75% for a few seconds, then settle down at around 50% usage while the program was running. Then it would drop back down to just above zero for anywhere from 5 minutes to an hour before another job came in.

I never really did know what BOINC was all about. When I signed up for it, I thought they were looking for signals from outer space, like studying pulsars, but I selected "open to all projects" a couple of months later when I didn't see much activity or jobs coming in.

However they do whatever it is they are doing has always remained well above my head.
For a couple of months when I had three unused computers, I loaded their program on all three and set them to run any project as well. It seemed at the time that somehow all three would be active at the same time, and the monitors showed about the same activity level on each; then they would all stop about the same time for fifteen minutes or so.

I imagine, if they have 1000 people signed up, it's like giving them 1000 CPUs to crunch data on.
Seems like what they do could be done on a supercomputer in a few seconds, but I guess it is cost prohibitive.

But here is my closing thought on the matter. Why can't I use ALL of my computers at once to do the same thing, as if they were together in a cluster? OK, I probably could if I knew how to do programming, hi hi.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

From what I understand, BOINC did indeed start out as a project to analyze signals from outer space: SETI, the search for extraterrestrial intelligence. But the bitheads at Berkeley weren't happy just doing that, so they got into other scientific ventures as well. Like yourself, I don't understand how they do it. Not the details anyway. It's basically RPC (remote procedure calls) gone wild. LOL It would be like you allowing me to log into your computer so that I can run a program while you are out eating dinner, or something. If I can get 100,000 people to agree to the same thing, then I'd have one hella computing power at my fingertips. Glenn told me about his participation, but I never warmed up to the idea. That's because I don't trust anybody using my computer if I don't know them personally. Those geniuses at Berkeley must be respected, and I'm not giving them an opportunity to use me to suit their own purposes, good or bad.

And, yes, you can use all your computers to emulate what BOINC is doing. I know you are clever enough to do a remote login and run one job at a time. But to coordinate ten, twenty, or a hundred might require some additional skills. LOL
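To make the RPC idea concrete, here's a bare-bones sketch using nothing but Python's standard library. This is not what BOINC actually ships, just the flavor of it; the port number and the crunch function are made up, and both halves run in one process here so the sketch is self-contained:

```python
# Minimal RPC demo: a "volunteer" machine exposes a function,
# and a coordinator calls it over the network. On a real LAN the
# server would run on the volunteer's box and the ServerProxy URL
# would point at that machine's address instead of localhost.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def crunch(numbers):
    """Stand-in for a real work unit: just sum the numbers."""
    return sum(numbers)

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(crunch)
threading.Thread(target=server.serve_forever, daemon=True).start()

volunteer = ServerProxy("http://localhost:8000")
print(volunteer.crunch([1, 2, 3, 4]))  # runs on the "remote" box: 10
```

Coordinating a hundred volunteers is then "just" a loop over a hundred proxies, plus all the scheduling, retry, and trust headaches that make BOINC a real project.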
User avatar
Kellemora
Guardian Angel
Posts: 7494
Joined: 16 Feb 2015, 17:54

Re: Another Win For Linux

Post by Kellemora »

Glenn had accrued one heck of a lot of points with them that he probably never got a chance to use for anything. But then too, there is a high probability that after he knew his days were numbered, he cashed them in.

I had almost forgotten, he gave me a different URL to download a different type of program, and I put it on another old computer, so they were actually using two of my computers.
After I bought the two matching iMicro-housed computers and changed things around up here, I got rid of the two oldest computers BOINC was using. The computer I had been using when I got the two new ones I set aside for accounting, and it was not connected to my LAN. I really was a lot more paranoid about my data back then than I am now.
Then, after getting hit with the ransomware attack through my frau's computer, which was able to hit all the NTFS drives connected to my Linux computers, I now duplicate my data on each backup drive. Half the drive is NTFS and half is EXT4. Both partitions are identical: I save to the EXT4 partition, then copy that partition to the NTFS partition using Rsync. I hope this keeps my data safe!
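For anyone curious, that mirroring step boils down to one rsync call. A minimal sketch, wrapped in Python, assuming the two halves are mounted at /mnt/backup-ext4 and /mnt/backup-ntfs (those mount points are hypothetical; adjust to your own layout):

```python
# Mirror the EXT4 backup partition onto its NTFS twin.
import subprocess

SRC = "/mnt/backup-ext4/"  # trailing slash: copy contents, not the folder
DST = "/mnt/backup-ntfs/"

# -rtv: recurse, keep modification times, verbose. NTFS can't store
# Unix owners/permissions, so full -a archive mode is skipped here.
# --delete keeps the mirror exact by removing files gone from SRC.
subprocess.run(["rsync", "-rtv", "--delete", SRC, DST], check=True)
```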

When I attempted to build a cluster out of separate computers, I used a set of directions almost identical to these, but I couldn't find them again to get a link. The text at this link reads about the same:
https://www.linux.com/blog/building-beo ... t-13-steps

I either never got it to work, or didn't know how to use it properly.
The sample programs they had on the website to run for testing purposes all worked fine, so I assume it was set up right.
But I think it would only run executable files, which wasn't what I was after.

In retrospect, what I created was probably a mini-supercomputer which could do only one computing job at a time.
What I wanted was to make my old computers work together like one newer, faster computer.
Like surfing the web with a browser, and doing things through the browser like playing Farm Town, hi hi.
No such luck.
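That matches how those Beowulf clusters work: the nodes only act like one machine for programs written against a message-passing library such as MPI, so an ordinary browser never sees the extra horsepower. For reference, the kind of test executable those instructions have you run looks roughly like this, sketched with the mpi4py library (assuming an MPI runtime and mpi4py are installed):

```python
# Classic Beowulf-style test program using mpi4py.
# Launch with:  mpirun -n 4 python hello_mpi.py
# Each copy runs on a (possibly different) node; only programs
# written this way can spread work across the cluster.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()  # this copy's ID within the cluster
size = comm.Get_size()  # how many copies are running in total

print(f"hello from process {rank} of {size}")

# A tiny taste of real work-sharing: every node contributes a
# number and rank 0 receives the sum.
total = comm.reduce(rank, op=MPI.SUM, root=0)
if rank == 0:
    print(f"sum of all ranks: {total}")
```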

I guess I'll just have to figure out how to afford an 8-core computer with 64 gigs of memory, not that I would ever use it to its potential playing Farm Town, hi hi.
User avatar
yogi
Posts: 9978
Joined: 14 Feb 2015, 21:49

Re: Another Win For Linux

Post by yogi »

So, basically, what I get out of reading that article is that the cluster the author describes is a server with each client being a separate process that contributes to the whole. I guess that clustering a pile of old computers would be better than adding them to a landfill, but I have to wonder how much better. Many years before the first multi-core processor was fabricated, we reached the physical limits of silicon computing power. We are used to judging a µP by its clock speed and data bus width, but, as I say, real throughput is limited by the physics of things. If I recall the number correctly, the most we can expect to clock through a µP is 1.3 GB a second. Those clock speeds that are advertised to be any more than that are fictitious. When they say your processor can clock at 4.2 GHz, for example, that's the equivalent throughput of 8 cores (16 threads) running as many different processes simultaneously. The real throughput of each individual core remains at the physical limit of just over 1 GHz.

What I'm getting at here is that the collection of not-so-modern processors you may have in your closet cannot perform anywhere near the speed of something you can buy today. So, if you cluster ten of them together, you might be able to perform as well as a dual-core AMD Athlon, which isn't shabby. But you'd be better off getting the Athlon processor than running ten power hogs in parallel.