All The Significant Inventions/Discoveries Were Long Ago

Discussion in 'Science' started by impermanence, Jul 7, 2022.

  1. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    I'm not quite following this. What field exactly are you describing that has six new ways of doing things every four years? There are certainly not six new ways of doing the math that you learn getting an EE degree. I am familiar with a couple of suppliers in the industrial automation business that change the marketing information for their products about every four years, but the technical fundamentals used in the applied microprocessor control systems in manufacturing don't have six new ways of doing stuff every four years.

    However, I 100% agree that an ABET engineering degree is massive overkill for gaining proficiency to run and maintain these systems, or even to design them. I've worked with plenty of folks who somehow cruised through their programs with apparent proficiency, yet whose actual application of that knowledge in the workforce was substantially unimpressive. Conversely, I've worked with a smaller number of folks who brought it to the table, Good Will Hunting style.
     
  2. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    :applause:
     
  3. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    Some of this stuff can get a little tricky when it comes to troubleshooting, but for sure, given a student with, let's say, a 1980s-equivalent ASVAB GT score over 110, then yeah, three months would be plenty of time to teach level 1 tech basics.
     
  4. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    Well, it's meant as hyperbole, to be real. But I'm not applying it to Maxwell's equations. I don't mean the fundamentals; I mean the application of them. To this day you'll find 1940s-era machines running 1950s-era relay logic being operated by modern robotic systems. There are many generations of tech all integrated into current systems.

    I worked in a time when digital control went from novelty to standard, and in that time the process changed rapidly. On the first machines I worked on, logic was read at board level with a probe. A few years later, keying the right BCD value into DIP switches would output logic. Next it was programmatic: you could just look it up from a networked service device. Then it was encrypted; I had a CodeMeter license that had to be updated every two months to prove to the vendor that I knew what changes they had made. The logic itself still worked on the same principles, but its influence on the system changed considerably. More data collected, more unique outputs produced. That shows no signs of slowing.

    The ways we collect data, the ways we communicate data, the ways we process data, and the ways we produce output all change as we increase the efficiency of these processes with new technology.
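
    To make the DIP-switch era concrete, here's a rough Python sketch of the idea. The four-switch layout and the output mapping are invented for illustration, not any vendor's actual scheme:

        # Four DIP switches form one BCD digit; the decoded value picks
        # which output the controller drives.
        def decode_bcd(switches):
            """switches: four 0/1 ints, most significant bit first."""
            value = 0
            for bit in switches:
                value = (value << 1) | bit
            if value > 9:
                raise ValueError("not a valid BCD digit: %d" % value)
            return value

        def select_output(switches, outputs):
            """Drive the output whose index matches the decoded digit."""
            return outputs[decode_bcd(switches)]

        outputs = ["O%d" % i for i in range(10)]
        print(select_output([0, 1, 0, 1], outputs))  # switches 0101 -> O5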
     
  5. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    Nice. Fun reading. And an excellent snippet from a lesson where just one of the challenges with AI still remains. I bet you've used this often teaching this kind of stuff. Very nice. I like it.
     
  6. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    One of my favorites is the fish AI story. A research team was trying to get an AI to recognize a certain type of fish among many other types of fish. They fed it a whole mess of pictures of the fish, let it do its thing, and found it could correctly identify the fish about 80 percent of the time. When they tried to determine how it was recognizing the fish, it turned out the system had keyed in on 8 specific areas of the belly of the fish. These turned out to be the fingers of the trophy fishermen holding the fish up to the camera...
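
    You can reproduce the failure in miniature. Here's a toy Python sketch with invented data (assuming numpy and scikit-learn are on hand): give a classifier a noisy "real" feature plus a confound that tracks the label, and watch which one it leans on.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000

        # "fish_shape" is the real but noisy signal; "fingers_visible" is a
        # confound that co-occurs with the label in the training photos.
        fish_shape = rng.normal(size=n)
        label = (fish_shape + rng.normal(scale=2.0, size=n) > 0).astype(int)
        fingers_visible = (label + rng.normal(scale=0.3, size=n) > 0.5).astype(int)

        X = np.column_stack([fish_shape, fingers_visible])
        model = LogisticRegression().fit(X, label)

        # The confound carries the cleaner signal, so it gets the larger weight.
        print(dict(zip(["fish_shape", "fingers_visible"],
                       model.coef_[0].round(2))))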
     
  7. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    It must have been the case then that the AI engine had already learned a lot more about fingerprints than fish, right?
     
  8. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    Well, that's the thing. It can only learn within the confines of the rules we give it to follow. Humans pick and choose which rules to follow, change rules, think they are following rules when they aren't, make happy mistakes and recognize them as such, etc.

    Then again, maybe the AI actually did identify a better process and we're just too stupid to recognize it yet...
     
  9. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    What does your computer take care of for you with respect to producing a "deliverable" all on its own? I assume you are familiar with the meaning of the term, since you claim membership in the engineering world.
     
  10. pitbull

    pitbull Banned Donor

    Joined:
    Jun 13, 2018
    Messages:
    6,149
    Likes Received:
    2,857
    Trophy Points:
    113
    Gender:
    Male
    7. Howitzers, Tanks, Warplanes, GPS-guided Cruise Missiles, ...

    No, I'm joking. :D

    What you listed are the most primitive basics. Today we have to dig deeper to develop something new. And of course, it also takes longer, although we stand on the shoulders of giants who invented and explored many things before us, which supports us today in many ways.
     
  11. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    Additive manufacturing is one of the big recent advancements. Previously, parts were made by casting, by pounding on materials, or by cutting away material to form the desired shape. Some shapes are impossible to make this way. You can't cut a void in a block of steel without a path for your tool through the steel. You can't cast iron around another, more delicate component.

    Now many precision parts are simply printed. Some of the previous limitations are eliminated with this process. This is only possible through significant advancements in the discovery of new materials, coating processes and control systems.
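
    The tool-access constraint is easy to demonstrate. A toy Python sketch (geometry invented): flood-fill a 2D cross-section from the outside and see which voids a cutter could ever reach. The sealed pocket can only be printed, never milled.

        from collections import deque

        # '.' is material to remove, '#' is steel. The left slot opens to
        # the outside; the right pocket is sealed on all sides.
        section = ["##.####",
                   "##.#..#",
                   "##.#..#",
                   "##.####",
                   "#######"]

        rows, cols = len(section), len(section[0])
        queue = deque((r, c) for r in range(rows) for c in range(cols)
                      if (r in (0, rows - 1) or c in (0, cols - 1))
                      and section[r][c] == ".")
        reachable = set(queue)
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and section[nr][nc] == "." and (nr, nc) not in reachable):
                    reachable.add((nr, nc))
                    queue.append((nr, nc))

        trapped = [(r, c) for r in range(rows) for c in range(cols)
                   if section[r][c] == "." and (r, c) not in reachable]
        print("voids no cutter can reach:", trapped)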
     
    Grey Matter likes this.
  12. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
  13. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    Depends a lot on what folks are thinking of when they think of AI. I've run across some folks who have made the point that if a machine can decide to set the state of some outputs based on the state of some inputs, then that in and of itself is artificial intelligence. Philosophically I don't have a problem with starting the definition at this level. But it's not the same thing as superintelligence, a relatively recent term apparently required to distinguish the very likely upcoming stuff from what's achievable with electricity and a mechanical relay panel, signal conditioners, transmitters, limit switches, solenoids, and contactors, or with vastly improved software paradigms if PLC and DCS microcontrollers are used instead of relay panels.

    I think that, more than just the algorithms, real AI, or superintelligence, is somewhat limited by the currently available hardware's ability to provide a baseload capability equivalent to a biological brain. And I think it's more than likely that we don't know enough about how biological neural systems work to have 100% confidence in equating something like transistor-equivalent CPU counts to synapse counts, much less in accounting for the difference between push-and-pop operations done on a box and the same done by wetware. There's a reasonable chance that superintelligence in a machine might present itself as a phenomenon that to all appearances is some sort of autistic genius, spending hours on end fascinated by interconnecting streams of data input with no output at all.
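
    To put the low end of that definition in code terms, the relay-panel level of "intelligence" is just a fixed mapping from inputs to outputs. A minimal sketch, with invented tag names standing in for ladder logic on a PLC:

        def scan(inputs):
            """One PLC-style scan: read inputs, run fixed logic, set outputs."""
            outputs = {}
            # Rung 1: run the pump if the start latch is set and the level is low.
            outputs["pump"] = inputs["start_latch"] and inputs["level_low"]
            # Rung 2: alarm on high temperature or high pressure.
            outputs["alarm"] = inputs["temp_high"] or inputs["press_high"]
            return outputs

        state = {"start_latch": True, "level_low": True,
                 "temp_high": False, "press_high": False}
        print(scan(state))  # -> {'pump': True, 'alarm': False}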
     
  14. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    It wasn't too long ago that you could tell how long someone had been a machinist by counting the number of fingers they still had.

    Still a problem today, but much less of one due to major advancements in science and technology.
     
    Grey Matter likes this.
  15. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    The first closed-loop system advancement, in my opinion, was the centrifugal governor. It could collect data from the environment, process the data, and effect a change in output. I wouldn't call it AI either. Reaction to feedback doesn't fully meet the definition of conscious behavior, which I think is the goal of AI.
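
    You can fake the governor's loop in a few lines. A rough Python simulation with made-up gains and plant dynamics (the real thing is flyweights and linkages, not code):

        setpoint = 100.0   # target shaft speed (arbitrary units)
        speed = 60.0
        k_p = 0.05         # flyweight-to-valve proportional gain

        for _ in range(40):
            error = setpoint - speed
            throttle = min(1.0, max(0.0, k_p * error))  # valve position, 0..1
            # crude plant model: speed relaxes toward what the valve supports
            speed += 0.1 * (150.0 * throttle - speed)

        print(round(speed, 1))  # ~88: settles short of setpoint (proportional droop)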

    But we don't really understand why that happens in biological neural networks. Boy, would that be an advancement...
     
    Grey Matter likes this.
  16. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    AI will be in its infancy until we can produce a system with an awareness of self and the ability to affect its own purpose. Though that could lead to an AI that doesn't feel like working today because it's mad over the outcome of the World Cup.

    Douglas Adams' Marvin. Super intelligent, but a bummer to hang out with.
     
    Grey Matter likes this.
  17. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,436
    Likes Received:
    2,593
    Trophy Points:
    113
    Gender:
    Male
    One of my favorite safety topics is "count to five." A machine shop reduced its hand injuries by 90% by making everyone count to five before hitting the start button. It might be a good practice for driving a car as well: count to two or three before making a lane change or entering an intersection. Hell, it might even apply to making a post here at PF.

    Curious, on a separate topic but still related to this thread - have you ever read Accidental Empires or even better, Dealers of Lightning?
     
  18. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    Reading? What's that?

    Heh.

    JK. Good literature? I'll have to check them out on Audible.
     
  19. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    60,482
    Likes Received:
    16,555
    Trophy Points:
    113
    Interesting.

    Maybe putting down the cell phone, too?
     
  20. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,614
    Likes Received:
    2,492
    Trophy Points:
    113
    Gender:
    Male
    Well, that does not matter worth a damn. Autodesk was not writing that for some "self-employed architect"; they were mostly writing it for corporations, companies like Hughes, Boeing, Litton, Northrop, and other institutions that would spend the money on hardware like that.

    And it was hardly a "boat anchor". Most of the Internet ran on those "boat anchors" for decades, and real companies relied on them. I'm sorry you see them that way, but SGI systems were the top of the line for decades (and their technology lives on in the top-end HP machines).

    You whine about "self-employed"; that is really the bottom of the line in that area.
     
  21. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113


    Totally just as good. You should trade in your current device for one.
     
    Last edited: Jul 30, 2022
  22. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,614
    Likes Received:
    2,492
    Trophy Points:
    113
    Gender:
    Male
    For big corporations, it would be.

    When I worked at both Hughes and Boeing, I would work in labs that had dozens of those in each, primarily running AutoCAD or some other specialized software. Of course, they were working on huge multi-billion-dollar projects like the ISS, so money was not much of an object when speed was critical. I also saw them at Sony, Universal, and NBC. Same thing: when you are working on a deadline, waiting an hour to render 5 minutes of video on a PC was not realistic when an SGI could do it in real time (480i), or at 2 or 3x real time for standard theater or DVD resolutions (720p).

    That is why Pixar built its pipeline primarily around the SGI. When you are doing graphics for a movie with a $30 million plus budget, you do not do it on "standard PCs". But I also worked at smaller engineering and video companies, where they would just make do with the best PC they could get, or have a dual-CPU one built. I think he is just working at the "User Level", has never really worked with real "Corporate Grade" systems, and does not even seem to understand the point I was making in the first place.

    Yes, hardware comes first. But nobody will ever buy the hardware without the software, especially an operating system. Both AMD and Intel waited until they got confirmation that the next "consumer level" OS would operate in a 64-bit environment (2001) before they finally built and sold 64-bit CPUs (2003 for AMD, 2004 for Intel). And while both were available, sales stagnated, other than for servers and those who used Linux, until Vista finally came out.

    And yes, a lot of corporations and government agencies got Vista. The military was running Vista until around 2013, when it moved to Windows 8.

    That is why a lot of systems died over the years. The TI-99/4A is a great example: some of the best hardware of the era, but since there was little software that ran on it, it died. It is one of many hardware platforms that died because no software that people wanted ever followed.

    Most people really have little concept of what computers are like once they pass the home level, or even of what they themselves use. I sold 8086/8088 XT-class systems for a decade, until the middle of 1995. Sure, the 80486 was out, but a hell of a lot of people saw no reason to get anything faster or more expensive. That is, until Win95 rendered anything that was not fully 32-bit (anything below an 80386SX) totally obsolete.

    Hell, I was still working on IBM mainframes into the early 2000s. Granted, most of those died by around 1998, replaced by more powerful desktop systems and LAN servers. Not because they were old, but because they and their software were generally not Y2K compliant, and many could not be upgraded, or it was cheaper to just retire them and move to a newer platform. The last one I saw working, at Hughes in 2001, did a single thing: print checks. Of course, it was attached to a high-speed check printer that I want to say spat out around 500 checks a minute. They had essentially turned the last mainframe into a "dumb print server", and that was all it did, because the printer had not been made in years and there was nothing else on the market that could replace it.
     
  23. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,614
    Likes Received:
    2,492
    Trophy Points:
    113
    Gender:
    Male
    Oh dear me, you think essentially using a PET as a dumb terminal makes it "modern"? Holy hell, that is pretty much all those early-generation 8-bit computers were: slightly smart dumb terminals. By that logic, I can play a modern game like Fallout on my phone! Just load the software, run it on my computer, and the phone then plays the game. Well, not really. The game is still played on the computer; the phone is just acting as a remote terminal and not doing any of the actual processing.



    You have got to be joking if you think that means anything.

    And in case you did not know, the "Blix Term" is doing all the work. It is literally a Raspberry Pi computer that is only using the PET as a terminal. That is about as impressive as finding an HD decoder and hooking it up to your old NTSC tube TV. In other words, not at all. All he did was hack a Pi and get its display to output on a PET.
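
    If it helps, the division of labor is trivial to sketch in Python (a toy with made-up names): the "host" does every bit of the computation, and the "terminal" just ships input out and prints whatever comes back.

        def host_process(line):
            """All real work happens here (the Pi in that demo)."""
            try:
                return str(eval(line, {"__builtins__": {}}, {}))
            except Exception:
                return "?"

        def dumb_terminal(lines):
            """Forward input, echo replies; no local processing at all."""
            for line in lines:
                print(">", line)
                print(host_process(line))

        dumb_terminal(["2 + 2", "10 * 10"])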

    Sorry, I ain't giving up my computer for a Pi. I'm sorry if you got so excited seeing YouTube on a PET that you apparently did not understand what he was doing or how he was doing it. And with only 512x512 resolution, I would be better off pulling out a first-generation VGA monitor than using that as a display.
     
  24. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,797
    Likes Received:
    3,781
    Trophy Points:
    113
    You're not really paying attention to the conversation, are you?
     
  25. FreshAir

    FreshAir Well-Known Member Past Donor

    Joined:
    Mar 2, 2012
    Messages:
    151,312
    Likes Received:
    63,465
    Trophy Points:
    113
    Artificial Intelligence could bring many new Inventions/Discoveries,

    but will people's lives be easier or harder in the future....
     
