
      How To Use the MySQL BLOB Data Type to Store Images with PHP on Ubuntu 18.04


      The author selected Girls Who Code to receive a donation as part of the Write for DOnations program.

      Introduction

      A Binary Large Object (BLOB) is a MySQL data type that can store binary data such as images, multimedia, and PDF files.

      When creating applications that require a tightly coupled database, where images should stay in sync with related data (for example, an employee portal, a student database, or a financial application), you might find it convenient to store images such as students’ passport photos and signatures in a MySQL database alongside other related information.

      This is where the MySQL BLOB data type comes in. This programming approach eliminates the need for a separate file system for storing images. The scheme also centralizes the database, making it more portable and secure because the data is isolated from the file system. Creating backups is also more seamless, since you can create a single MySQL dump file that contains all your data.

      Retrieving data is faster, and when creating records you can be sure that data validation rules and referential integrity are maintained, especially when using MySQL transactions.

      In this tutorial, you will use the MySQL BLOB data type to store images with PHP on Ubuntu 18.04.

      Prerequisites

      To follow along with this guide, you will need the following:

      • An Ubuntu 18.04 server with a non-root sudo user.

      • A LAMP stack (Apache, MySQL, and PHP) installed, since the scripts in this guide are served from the /var/www/html directory.

      Step 1 — Creating a Database

      You’ll start off by creating a sample database for your project. To do this, SSH in to your server and then run the following command to log in to your MySQL server as root:

      • mysql -u root -p

      Enter the root password of your MySQL database and hit ENTER to continue.

      Then, run the following command to create a database. In this tutorial we’ll name it test_company:

      • CREATE DATABASE test_company;

      Once the database is created, you will see the following output:

      Output

      Query OK, 1 row affected (0.01 sec)

      Next, create a test_user account on the MySQL server and remember to replace PASSWORD with a strong password:

      • CREATE USER 'test_user'@'localhost' IDENTIFIED BY 'PASSWORD';

      You’ll see the following output:

      Output

      Query OK, 0 rows affected (0.01 sec)

      To grant test_user full privileges on the test_company database, run:

      • GRANT ALL PRIVILEGES ON test_company.* TO 'test_user'@'localhost';

      Make sure you get the following output:

      Output

      Query OK, 0 rows affected (0.01 sec)

      Finally, flush the privileges table in order for MySQL to reload the permissions:

      • FLUSH PRIVILEGES;

      Ensure you see the following output:

      Output

      Query OK, 0 rows affected (0.01 sec)

      Now that the test_company database and test_user are ready, you’ll move on to creating a products table for storing sample products. You’ll use this table later to insert and retrieve records to demonstrate how MySQL BLOB works.

      Log out from the MySQL server:

      • QUIT;

      Then, log back in again with the credentials of the test_user that you created:

      • mysql -u test_user -p

      When prompted, enter the password for the test_user and hit ENTER to continue. Next, switch to the test_company database by typing the following:

      • USE test_company;

      Once the test_company database is selected, MySQL will display:

      Output

      Database changed

      Next, create a products table by running:

      • CREATE TABLE `products` (product_id BIGINT PRIMARY KEY AUTO_INCREMENT, product_name VARCHAR(50), price DOUBLE, product_image BLOB) ENGINE = InnoDB;

      This command creates a table named products. The table has four columns:

      • product_id: This column uses a BIGINT data type in order to accommodate a large list of products up to a maximum of 2⁶³-1 items. You’ve marked the column as PRIMARY KEY to uniquely identify products. In order for MySQL to handle the generation of new identifiers for inserted rows, you have used the keyword AUTO_INCREMENT.

      • product_name: This column holds the names of the products. You’ve used the VARCHAR data type since this field will generally handle alphanumerics up to a maximum of 50 characters—the limit of 50 is just a hypothetical value used for the purpose of this tutorial.

      • price: For demonstration purposes, your products table contains the price column to store the retail price of products. Since prices can have fractional values (for example, 23.69, 45.36, 102.99), you’ve used the DOUBLE data type.

      • product_image: This column uses a BLOB data type to store the actual binary data of the products’ images.

      You’ve used the InnoDB storage ENGINE for the table to support a wide range of features, including MySQL transactions. After executing this command to create the products table, you’ll see the following output:

      Output

      Query OK, 0 rows affected (0.03 sec)

      Log out from your MySQL server:

      • QUIT;

      You will get the following output:

      Output

      Bye

      The products table is now ready to store some records, including product images, and you’ll populate it with some products in the next step.

      Step 2 — Creating PHP Scripts for Connecting and Populating the Database

      In this step, you’ll create a PHP script that will connect to the MySQL database that you created in Step 1. The script will prepare three sample products and insert them into the products table.

      To create the PHP code, open a new file with your text editor:

      • sudo nano /var/www/html/config.php

      Then, enter the following information into the file and replace PASSWORD with the test_user password that you created in Step 1:

      /var/www/html/config.php

      <?php
      
      define('DB_NAME', 'test_company');
      define('DB_USER', 'test_user');
      define('DB_PASSWORD', 'PASSWORD');
      define('DB_HOST', 'localhost');
      
      $pdo = new PDO("mysql:host=" . DB_HOST . ";dbname=" . DB_NAME, DB_USER, DB_PASSWORD);
      $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
      $pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
      
      

      Save and close the file.

      In this file, you’ve used four PHP constants to connect to the MySQL database that you created in Step 1:

      • DB_NAME: This constant holds the name of the test_company database.

      • DB_USER: This constant holds the test_user username.

      • DB_PASSWORD: This constant stores the MySQL PASSWORD of the test_user account.

      • DB_HOST: This represents the server where the database resides. In this case, you are using the localhost server.

      The following line in your file initiates a PHP Data Object (PDO) and connects to the MySQL database:

      ...
      $pdo = new PDO("mysql:host=" . DB_HOST . ";dbname=" . DB_NAME, DB_USER, DB_PASSWORD);
      ...
      

      Toward the end of the file, you’ve set a couple of PDO attributes:

      • ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION: This attribute instructs PDO to throw an exception whenever a query fails, which can be logged for debugging purposes.
      • ATTR_EMULATE_PREPARES, false: This option increases security by having the MySQL database engine prepare statements natively instead of having PDO emulate them.

      You’ll include the /var/www/html/config.php file in two PHP scripts that you will create next for inserting and retrieving records respectively.

      First, create the /var/www/html/insert_products.php PHP script for inserting records to the products table:

      • sudo nano /var/www/html/insert_products.php

      Then, add the following information into the /var/www/html/insert_products.php file:

      /var/www/html/insert_products.php

      <?php
      
      require_once 'config.php';
      
      $products = [];
      
      $products[] = [
                    'product_name' => 'VIRTUAL SERVERS',
                    'price' => 5,
                    'product_image' => file_get_contents("https://i.imgur.com/VEIKbp0.png")
                    ];
      
      $products[] = [
                    'product_name' => 'MANAGED KUBERNETES',
                    'price' => 30,
                    'product_image' => file_get_contents("https://i.imgur.com/cCc9Gw9.png")
                    ];
      
      $products[] = [
                    'product_name' => 'MySQL DATABASES',
                    'price' => 15,
                    'product_image' => file_get_contents("https://i.imgur.com/UYcHkKD.png" )
                    ];
      
      $sql = "INSERT INTO products(product_name, price, product_image) VALUES (:product_name, :price, :product_image)";
      
      $stmt = $pdo->prepare($sql);
      
      foreach ($products as $product) {
          $stmt->execute($product);
      }
      
      echo "Records inserted successfully";
      

      Save and close the file.

      In the file, you’ve included the config.php file at the top. This is the first file you created for defining the database constants and connecting to the database. The file also initiates a PDO object and stores it in a $pdo variable.

      Next, you’ve created an array of the products’ data to be inserted into the database. Apart from the product_name and price, which are prepared as strings and numeric values respectively, the script uses PHP’s built-in file_get_contents function to read images from an external source and pass them as strings to the product_image column.

      Next, you have prepared an SQL statement with named placeholders and used a PHP foreach loop to insert each product into the database.
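      This pattern — a parameterized INSERT executed once per record — is not specific to PHP. As a rough, testable sketch of the same idea (not part of the tutorial itself), here it is in Python against an in-memory SQLite database, which stands in for MySQL; the byte strings are placeholders for real image data:

```python
import sqlite3

# In-memory SQLite stands in for the MySQL test_company database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE products (
    product_id INTEGER PRIMARY KEY AUTOINCREMENT,
    product_name TEXT,
    price REAL,
    product_image BLOB)""")

# Placeholder binary data instead of PNG files fetched from a URL.
products = [
    {"product_name": "VIRTUAL SERVERS", "price": 5, "product_image": b"\x89PNG-bytes-1"},
    {"product_name": "MANAGED KUBERNETES", "price": 30, "product_image": b"\x89PNG-bytes-2"},
    {"product_name": "MySQL DATABASES", "price": 15, "product_image": b"\x89PNG-bytes-3"},
]

# Named placeholders, like PDO's :product_name bound parameters.
sql = ("INSERT INTO products (product_name, price, product_image) "
       "VALUES (:product_name, :price, :product_image)")

for product in products:
    conn.execute(sql, product)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM products").fetchone()[0]
print(count)  # 3
```

      As with PDO, binding the values instead of concatenating them into the SQL string is what protects the insert against SQL injection.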

      To execute the /var/www/html/insert_products.php file, run it in your browser window using the following URL. Remember to replace your-server-IP with the public IP address of your server:

      http://your-server-IP/insert_products.php
      

      After executing the file, you’ll see a success message in your browser confirming records were inserted into the database.

      A success message showing that records were inserted to database

      You have successfully inserted three records containing product images into the products table. In the next step, you’ll create a PHP script for retrieving these records and displaying them in your browser.

      Step 3 — Displaying Products’ Information From the MySQL Database

      With the products’ information and images in the database, you’re now going to code another PHP script that queries and displays the products’ information in an HTML table on your browser.

      To create the file, type the following:

      • sudo nano /var/www/html/display_products.php

      Then, enter the following information into the file:

      /var/www/html/display_products.php

      <html>
        <head>
          <title>Using BLOB and MySQL</title>
        </head>
        <body>
      
        <?php
      
        require_once 'config.php';
      
        $sql = "SELECT * FROM products";
        $stmt = $pdo->prepare($sql);
        $stmt->execute();
        ?>
      
        <table border = '1' align = 'center'> <caption>Products Database</caption>
          <tr>
            <th>Product Id</th>
            <th>Product Name</th>
            <th>Price</th>
            <th>Product Image</th>
          </tr>
      
        <?php
        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            echo '<tr>';
            echo '<td>' . $row['product_id'] . '</td>';
            echo '<td>' . $row['product_name'] . '</td>';
            echo '<td>' . $row['price'] . '</td>';
            echo '<td>' .
            '<img src = "data:image/png;base64,' . base64_encode($row['product_image']) . '" width = "50px" height = "50px"/>'
            . '</td>';
            echo '</tr>';
        }
        ?>
      
        </table>
        </body>
      </html>
      

      Save the changes to the file and close it.

      Here you’ve again included the config.php file in order to connect to the database. Then, you have prepared and executed an SQL statement using PDO to retrieve all items from the products table using the SELECT * FROM products command.

      Next, you have created an HTML table and populated it with the products’ data using the PHP while() {...} statement. The line $row = $stmt->fetch(PDO::FETCH_ASSOC) fetches one row of the result set at a time and stores it in the $row variable as an associative array, which you have then displayed in an HTML table column using the $row['column_name'] syntax.
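      For comparison only, the same row-by-row fetch pattern can be sketched in Python with sqlite3, whose Row factory plays roughly the role of PDO::FETCH_ASSOC (the table and rows here are hypothetical stand-ins for the tutorial's data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # rows become addressable by column name
conn.execute("CREATE TABLE products (product_id INTEGER PRIMARY KEY, "
             "product_name TEXT, price REAL)")
conn.execute("INSERT INTO products VALUES (1, 'VIRTUAL SERVERS', 5.0)")
conn.execute("INSERT INTO products VALUES (2, 'MySQL DATABASES', 15.0)")

stmt = conn.execute("SELECT * FROM products ORDER BY product_id")
names = []
while True:
    row = stmt.fetchone()              # one row per call, like $stmt->fetch(...)
    if row is None:                    # PDO's fetch() returns false at the end
        break
    names.append(row["product_name"])  # access by name, like $row['product_name']

print(names)  # ['VIRTUAL SERVERS', 'MySQL DATABASES']
```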

      The images from the product_image column are embedded in <img> tags. You’ve used the width and height attributes to resize the images to a smaller size that can fit in the HTML table column.

      In order to convert the data held by the BLOB data type back to images, you’ve used the built-in PHP base64_encode function and the following syntax for the Data URI scheme:

      data:media_type;base64,base_64_encoded_data
      

      In this case, image/png is the media_type and the Base64-encoded string from the product_image column is the base_64_encoded_data.
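      As a quick illustration of that scheme outside PHP, here is a short Python sketch; the 8-byte PNG file signature stands in for real image bytes, since any binary data can be encoded the same way:

```python
import base64

# Stand-in binary data: the 8-byte PNG file signature rather than a full image.
image_bytes = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])

# media type + ";base64," separator + encoded payload
data_uri = "data:image/png;base64," + base64.b64encode(image_bytes).decode("ascii")

print(data_uri)  # data:image/png;base64,iVBORw0KGgo=
```

      Decoding the part after the comma with base64 recovers the original bytes exactly, which is why the browser can render the BLOB contents as an image.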

      Next, execute the display_products.php file in a web browser by typing the following address:

      http://your-server-IP/display_products.php
      

      After running the display_products.php file in your browser, you will see an HTML table with a list of products and associated images.

      List of products from MySQL database

      This confirms that the PHP script for retrieving images from MySQL is working as expected.

      Conclusion

      In this guide, you used the MySQL BLOB data type to store and display images with PHP on Ubuntu 18.04. You’ve also seen the basic advantages of storing images in a database as opposed to storing them in a file system. These include portability, security, and ease of backup. If you are building an application such as a students’ portal or employees’ database that requires information and related images to be stored together, then this technology can be of great use to you.

      For more information about the supported data types in MySQL follow the MySQL Data Types guide. If you’re interested in further content relating to MySQL and PHP, check out the following tutorials:




      The Flagship Series: Montreal Data Center Market Overview


      Montreal is gaining global prominence as a data center hotspot. While the market is still smaller than that of its Canadian counterpart, Toronto, professional organizations are taking note of the opportunities to be had in this Quebec-based market.

      Named the “best location in the world to set up a data center” by the Datacloud world congress, Montreal won the Excellence in Data Centres Award in June 2019. And datacenterHawk believes Montreal will soon overtake the Toronto market, as the last two years have seen rapid expansion. The major industries using data center space in Montreal include technology, pharmaceuticals, manufacturing, tourism and transportation.

      Roberto Montesi, INAP’s Vice President of International Sales, said that the data center market is growing fast in Montreal because of the low cost of power and low land taxes. “The colder temperatures also permit us to run free cooling up to 10 months a year,” he said, noting that Montreal and Canada have a great relationship with the U.S. “It’s an easy extension for any American business to come up to Montreal and have access to so much great talent in our industry.”

      Considering Montreal for a colocation, network or cloud solution? There are several reasons why we’re confident you’ll call INAP your future partner in this competitive market.

      INAP’s Mark on Montreal

      INAP maintains four data centers and POPs in Montreal, including three flagship facilities. INAP’s Saint-Léonard (5945 Couture Blvd) and LaSalle (7207 Newman Blvd) flagships are cloud hosting facilities operated by iWeb, an INAP company.

      INAP’s Nuns’ Island flagship at 20 Place du Commerce offers high-density environments for colocation customers. The Nuns’ Island facility is a bunker-rated building and functions on a priority 1 hydro grid, the same as hospitals in the area. In addition to these features, our expert support technicians are dedicated to keeping your infrastructure online, secure and always operating at peak efficiency.

      Each INAP Montreal data center features sustainable, green design with state-of-the-art cooling and electricity that’s 99.9 percent generated from renewable sources. Customers are able to connect seamlessly with other major North American cities via our reliable, high-performing backbone to Boston and Chicago.

      Metro-wide, our Montreal data centers feature:

      • Power: 20 MW of power capacity, 20+ kW per cabinet
      • Space: Over 45,000 square feet of raised floor
      • Facilities: Designed with Tier 3 compliant attributes, located outside of flood plain and seismic zones
      • Energy Efficient Cooling: 2,500 tons of cooling capacity, N+1 with concurrent maintainability
      • Security: 24/7/365 onsite personnel, video surveillance, key card and secondary biometric authentication
      • Compliance: PCI DSS and SOC 2 Type II

      Download the Montreal Data Center spec sheet here [PDF].


      Gain an Edge with INAP’s Connectivity Solutions

      INAP’s connectivity solutions and global network can give customers the boost they need to outpace their competition. And Montreal is the perfect place to take advantage of INAP’s connectivity solutions. “We also have fiber rich density coming up from Ashburn and Europe. This makes us a great location for customers looking for Edge locations,” said Montesi.

      By joining us in our Montreal data centers, customers gain access to INAP’s global network. Our high-capacity network backbone and one-of-a-kind, latency-killing Performance IP® solution is available to all customers. This proprietary technology automatically puts outbound traffic on the best-performing route. Once you’re plugged into the INAP network, you don’t have to do anything to see the difference. Learn more about Performance IP® by reading up on the demo and trying it out yourself with a destination test.

      INAP Interchange for Colocation

      Considering Montreal for colocation, but not sure where the future will lead? With INAP’s global footprint, which includes more than 600,000 square feet of leasable data center space, customers have access to data centers woven together by our high-performance network backbone and route optimization engine, ensuring the environment can connect everywhere, faster.

      With INAP Interchange, a spend portability program available to new colocation or cloud customers, you can switch infrastructure solutions—dollar for dollar—part-way through a contract. This helps avoid environment lock-in and achieve current IT infrastructure goals while providing the flexibility to adapt for whatever comes next.

      INAP Colocation, Bare Metal and Private Cloud solutions are eligible for the Interchange program. Chat with us to learn more about these services, and how spend portability can benefit these infrastructure solutions.

      Laura Vietmeyer






      IT Pros Predict What Infrastructure and Data Centers Will Look Like by 2025


      Predicting the future of tech is astonishingly hard. Human foresight is derailed by a host of cognitive biases that lead us to overreact to an exciting development or completely miss what in hindsight seems obvious (like these tech titans who scoffed at the iPhone’s introduction).

      We still love to try our hands at prognosticating though—especially on trends and issues that hit close to home.

      That’s why INAP surveyed 500 IT leaders and infrastructure managers about the near-term future of their profession and the industry landscape. Participants were asked to agree or disagree with the likelihood of eight predictions becoming reality by 2025.

      The representative survey was conducted in the U.S. and Canada among businesses with more than 100 employees and has a margin of error of +/- 5 percent.

      Check out the results below, as well as some color commentary from INAP’s data center, cloud and network experts.

      By 2025, due to the advancements of AI and machine learning, most common data center and network tasks will be completely automated.

      Prediction 1

      The case for agree.

      “For the most basic of tasks, technology advancements like AI, Machine Learning, workflow management and others are quickly rising to a place of ‘hands off’ for those currently managing these tasks,” said TJ Waldorf, CMO and Head of Inside Sales and Customer Success at INAP. “In five years, we’ll see the pace of these advancements increase and the value seen by IT leaders also increase.”

      The case for disagree.

      It’s difficult to say how many data center and network advancements will truly be driven by artificial intelligence and machine learning as opposed to already-proven software-defined automation. The market for technologies like AIOps (Artificial Intelligence for IT Operations) is still nascent, despite rising interest. Regardless of their source, automation developments will benefit infrastructure and operations professionals, according to Waldorf.

      “The reality is that these developments will give them time back to spend more time on business-driving, revenue-accelerating tasks,” he said. “The more we safely automate, the less risk of human-caused errors, which top the list of problems in the data center.”

      By 2025, on-premise data centers will be virtually non-existent.

      Prediction 2

      The case for agree.

      Workloads are leaving on-prem data centers for cloud and colocation at an incredible rate. In a study published late last year, we found that infrastructure managers anticipate a 38 percent reduction in on-premise workloads by 2022, driven by a need for greater network performance, application scalability and data center resiliency.

      The case for disagree.

      Interestingly, 48 percent of non-senior infrastructure managers surveyed disagree with the prediction. INAP’s Josh Williams, Vice President of Channel and Solutions Engineering, thinks this group will likely prove right, despite the current migration trends.

      “Virtually non-existent is a bit of an overstatement,” said Williams. “The majority of workloads are still on prem today and it’s unlikely ‘virtually’ all of them will make it out for a variety of reasons. However, the trend is unmistakable: IT practitioners are abandoning data center management in huge numbers to help their applications perform and scale and allow them to focus on more than just keeping the lights on.”

      By 2025, most applications will be deployed using “serverless” models.

      Prediction 3

      The case for agree.

      The introduction of AWS Lambda in 2014 made waves for its promise of deploying apps without any consideration of resource provisioning or server management. Adoption of serverless is growing, and as Microsoft and Google continue to develop their Amazon alternatives, we can expect even more organizations to test it out.

      The case for disagree.

      Notably, 43 percent of non-senior infrastructure managers disagree with this statement. INAP’s Jennifer Curry, Senior Vice President of Global Cloud Services, agrees with them:

      “Serverless models have compelling use cases for ‘born in the cloud’ apps that have sporadic resource usage,” she said. “The tech, however, has a very long way to go before it’s the environment best suited for most workloads. The economics and performance calculus will favor other IaaS models for the foreseeable future, specifically for steady-state workloads and applications that require visibility for security and compliance.”

      Curry also notes that serverless is still a new and somewhat nebulous term that’s often misused as a synonym for any cloud or IaaS service, which could be skewing the optimism. Most public cloud usage still involves compute and storage services that require time-intensive, hands-on monitoring and resource management.

      By 2025, virtually all companies will have a multicloud presence.

      Prediction 4

      The case for agree.

      “We’re already in a multicloud world,” said Curry, noting surveys that suggest wide-spread adoption at the enterprise level. “The more interesting question to me is: How many enterprises have a coherent multicloud strategy? Deploying in multiple environments is easy. Adopting a management and monitoring apparatus that mitigates vulnerabilities, ensures peak performance, and optimizes costs across infrastructure platforms is a challenge many enterprises struggle with.”

      The case for disagree.

      Outside of small businesses (who were not polled in this survey), INAP experts didn’t see much of a case for ‘disagree’ here. A certain percentage of businesses may attempt to achieve efficiencies by going all in on a single platform, but issues with lock-in and performance will likely deter that. Add SaaS platforms to your definition of multicloud, which our experts believe you should, and it’s hard to see anything but a multicloud world by 2025.

      By 2025, due to increasing demands for hyper-low latency service, most enterprises will adopt “edge” networking strategies.

      Prediction 5

      The case for agree.

      “Depending on your definition of edge networking, this prediction is already on its way to being true,” said Williams. “An edge networking strategy is about reaching customers and end-users as quickly as possible. Whether it is achieved through geographically distributed cloud, CDN or network route optimizations, cutting latency will be a pre-requisite for the success of any mission-critical application.”

      The case for disagree.

      Waldorf echoes the notion that most companies will pursue latency-reduction in the coming years but suggests that just like the introduction of cloud in the mid-2000s, a full embrace of “edge” as an established concept may take longer.

      “Edge use cases are still evolving,” he said. “The idea has been around a lot longer, but in the context of today’s IT landscape it’s only recently become something more leaders are starting to research and think about why it matters to them.”

      By 2025, Chief Security Officers or Chief Information Security Officers will be considered the second most important role at most enterprises.

      Prediction 6

      The case for agree.

      Cybersecurity is among the most pressing challenges faced day in, day out, according to IT pros, and this is unlikely to change as attacks grow more intense and unpredictable. CSOs and CISOs are key to staying one step ahead of vulnerabilities and require the authority to make necessary investments.

      The case for disagree.

      “It makes sense IT pros would largely agree with this proposition, as security leaders, along with CIOs, will be responsible for managing extreme amounts of risk critical to revenue,” said Williams. “The issue with this prediction, however, is that the CSO’s role is typically only widely visible when things go very wrong. So it’s unlikely stakeholders internally or externally will view them second to the CEO, whether or not the distinction is deserved.”

      By 2025, IT and product development teams at most companies will be fully integrated.

      Prediction 7

      The case for agree.

      In a 2018 survey, nearly 90 percent of IT infrastructure managers said they want to take a leading role in their company’s digital transformation initiatives. And that makes perfect sense. The success of any digital product or service ultimately depends just as much on its infrastructure performance as its coding, design and marketing. Integrating infrastructure operations with product teams could accelerate that goal.

      The case for disagree.

      Curry thinks integration may be the wrong goal, and that IT can grow its influence within organizations and lead digital transformation through stronger partnerships.

      “IT teams will have more success focusing on alignment with product teams, as opposed to pursuing complex reorganizations,” said Curry. “Senior IT leadership will still need to make a strong case as to why they need to be at the table earlier rather than later in the product development lifecycle. We’re seeing many of our most successful customers achieve alignment, but it’s a process that can take time and patience.”

      By 2025, despite technological development, the IT function will essentially look the same as it did in 2020.

      Prediction 8

      The case for agree.

      New tools and platforms can be implemented without changing the overall function of IT—e.g., infrastructure deployments and application delivery, preventing downtime, supporting end-users, etc.

      The case for disagree.

      With the decline of on-premise data centers and the rise of multicloud and hybrid platforms, the function of IT will inevitably evolve. As IT pros spend less time on routine infrastructure upkeep and maintenance, more time can be allocated to projects that drive innovation and efficiency. In INAP’s recent State of IT Infrastructure Management report, we got a preview of how IT teams would spend that time.

      Ryan Hunt
      • Director of Content & Communications




