
      What is Debugging?


      In computing, debugging is the process of finding and resolving issues that prevent software from running correctly.

      A software bug is an error or fault in a codebase that leads to an unexpected result or unintended behavior. Because of this naming convention, the process of discovering and fixing bugs is referred to as debugging.

      The famous historical precedent for referring to glitches as “bugs” comes from computer pioneer Grace Hopper’s account of a moth found trapped in the Harvard Mark II electromechanical computer and subsequently taped into the logbook. The use of “bug” to describe an engineering defect predates this 1940s account, however, having been part of engineering jargon since at least the 1870s.

      A number of systems and tools can be used to find and fix bugs in software. Tactics include interactive debugging, unit testing, integration testing, and monitoring; software development tools, practices, and programming languages themselves also offer support for debugging, as the sketch below illustrates. To learn more about debugging, you can read how it is approached in Python in our series Debugging Python Programs.
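
      As a minimal illustration of two of these tactics, the hypothetical Python sketch below (not taken from the article) pairs a unit test with Python’s built-in breakpoint() hook; the function average() and the test values are assumptions chosen only for demonstration.

# Minimal sketch: a unit test guarding a small function, plus an optional
# interactive-debugging hook. The names and values here are hypothetical.

import unittest


def average(values):
    # Intended behavior: the arithmetic mean of a non-empty list of numbers.
    # Un-commenting the next line pauses execution in the interactive
    # debugger (pdb) so local variables can be inspected while the test runs.
    # breakpoint()
    return sum(values) / len(values)


class TestAverage(unittest.TestCase):
    def test_average_of_known_values(self):
        # If average() ever regresses (for example, by dividing by a
        # hard-coded length), this assertion fails and points at the bug.
        self.assertEqual(average([2, 4, 6]), 4)


if __name__ == "__main__":
    unittest.main()

      Running the file with python3 executes the test: a failing assertion narrows the search to the function under test, and the commented-out breakpoint() call shows where interactive debugging could take over to inspect values step by step.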


