This module is meticulously designed to cultivate a deep understanding of the fundamental concepts of Cloud Computing, exploring the different cloud segments, deployment models, and the inherent need for adopting Cloud Computing solutions. Students will develop an appreciation of the complexities involved in managing cloud service architectures and a critical awareness of projects built on Cloud Computing principles.
Render configuration
When I click "Login" in the render, it redirects me to the login page After that, when I log in using your GitHub account, my Render account automatically links or connects to my GitHub account This allows for seamless integration and authentication between my Render and GitHub accounts
Figure 1 Login page on Render
Figure 2 Login by GitHub account
Connecting Render to Git
After successfully creating my Render account, I go to my GitHub account settings, navigate to "Installations," and there I can find my Render account. I click to configure it.
Figure 3 Render account on GitHub
At this point, I change the repository access to "All repositories" to grant Render permission to access all of my repositories.
Figure 4 Repository access on github
PostgreSQL configuration on Render
Then, I log in to Render using my GitHub account and click "New" at the top right of the Render homepage. From there, I select "PostgreSQL" to create a new PostgreSQL database.
All I need to do is enter the name of my database; I can leave the other fields empty, and Render will automatically generate the necessary configuration for me. As a student, I can select the free plan to use the database at no cost on Render.
Figure 6 Interface for creating database (1)
Figure 7 Interface for creating database (2)
Database for ATN
Creating database
First, I need to create a database, which I name "atn." Then, I create five tables: "category" (to classify toy categories), "role" (to categorize admin, director, and users), "shop" (to manage toy shops), "toy" (representing ATN company products), and "users" (for regular users). The columns of each table are listed below, followed by a sketch of the corresponding DDL.
Figure 9 Tables in ATN database
Table "category":
categoryid: A unique identifier for each category
categoryname: The name of the toy category (e.g., "Action Figures," "Puzzles")
Table "role":
roleid: A unique identifier for each role
rolename: The name of the role (e.g., "Admin," "Director," "User")
Table "shop":
shopid: A unique identifier for each shop user
username: The username of the shop user
password: The password for the shop user
Table "toy":
toyid: A unique identifier for each toy
toyname: The name of the toy
price: The price of the toy
origin: The origin or place of manufacture
description: A description of the toy
category_id: A reference to the toy's category, linking to the "category" table
image: Possibly a field to store an image of the toy
shop_id: A reference to the shop managing or selling the toy, linking to the "shop" table
Table "users":
userid: A unique identifier for each user
username: The username of the user
password: The password for the user
role_id: A reference to the user's role, linking to the "role" table
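Since only the column names are given above, the following is a minimal sketch of the corresponding DDL; the column types, lengths, and constraints are my assumptions. It is written as a one-off Node script using the pg package already chosen for this project.

    // createTables.js: one-off script that creates the five ATN tables
    // (types, lengths, and the DATABASE_URL variable are assumptions)
    const { Pool } = require('pg');
    const pool = new Pool({
        connectionString: process.env.DATABASE_URL,
        ssl: { rejectUnauthorized: false } // Render's external connections use SSL
    });

    const ddl = `
        CREATE TABLE IF NOT EXISTS category (
            categoryid   SERIAL PRIMARY KEY,
            categoryname VARCHAR(100) NOT NULL
        );
        CREATE TABLE IF NOT EXISTS role (
            roleid   SERIAL PRIMARY KEY,
            rolename VARCHAR(50) NOT NULL
        );
        CREATE TABLE IF NOT EXISTS shop (
            shopid   SERIAL PRIMARY KEY,
            username VARCHAR(100) NOT NULL,
            password VARCHAR(255) NOT NULL
        );
        CREATE TABLE IF NOT EXISTS toy (
            toyid       SERIAL PRIMARY KEY,
            toyname     VARCHAR(200) NOT NULL,
            price       NUMERIC(10, 2),
            origin      VARCHAR(100),
            description TEXT,
            category_id INTEGER REFERENCES category(categoryid),
            image       TEXT,
            shop_id     INTEGER REFERENCES shop(shopid)
        );
        CREATE TABLE IF NOT EXISTS users (
            userid   SERIAL PRIMARY KEY,
            username VARCHAR(100) NOT NULL,
            password VARCHAR(255) NOT NULL,
            role_id  INTEGER REFERENCES role(roleid)
        );
    `;

    pool.query(ddl)
        .then(() => console.log('ATN tables created'))
        .catch((err) => console.error(err))
        .finally(() => pool.end());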
Connecting to the PostgreSQL database
After successfully creating the database on Render, I'll need to obtain the connection information for my PostgreSQL database. This information typically includes:
Host Name: The address of the PostgreSQL server where my database is hosted
Port: The port number on which the PostgreSQL server is listening for connections
Username: The username I'll use to access the database
Figure 15 Database information on Render
I'll use this information to set up a connection to my PostgreSQL database in my local development environment, allowing me to interact with the database and develop my application using the data stored in my Render-hosted database. A sketch of this connection setup follows Figure 16.
Figure 16 Database information in the local environment
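As a sketch, the details from Figure 15 map onto the pg package's Pool configuration roughly as follows; the host, user, and database values shown are placeholders rather than my real credentials.

    // db.js: shared connection pool used by the model files (values are placeholders)
    const { Pool } = require('pg');

    const pool = new Pool({
        host: 'dpg-xxxxxxxx.oregon-postgres.render.com', // Host Name from Render
        port: 5432,                                      // default PostgreSQL port
        user: 'atn_user',                                // Username from Render
        password: process.env.DB_PASSWORD,               // kept out of source control
        database: 'atn',
        ssl: { rejectUnauthorized: false }               // Render requires SSL for external connections
    });

    module.exports = pool;

Every model file can then require this single pool instead of opening its own connection.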
Code implementation
Use case diagram
In my system for ATN Company:
Regular users can view the list of toys
Directors can view the list of toys and set the time interval for page refresh
Shop managers can view the list of toys, update the toy information, add new toys to the inventory, and delete toys from the inventory
Admin users can view the list of all users and shoppers in the system
Figure 17 Use case diagram for ATN
Project structure
In my code, I've used the following technologies and structure:
I've employed the Express.js framework for building my application
I've used Tailwind CSS to create an aesthetically pleasing user interface (UI)
I've incorporated common packages such as Nodemon for auto-reloading during development, a session middleware (such as express-session) for managing user sessions, and pg (the node-postgres client) for PostgreSQL database interaction
Inside the "helpers" folder, I've organized JavaScript files to perform various external tasks that support my application
The "models" directory contains files responsible for interacting with the database, handling data storage, and retrieval
The "public" folder houses CSS, JavaScript, and image files that are publicly accessible to enhance the appearance and functionality of my application
The "views" directory contains the pages and user interface components used in my application
Additionally, I have other external configuration files that are essential for my project's setup and configuration
This structure and the technologies I've chosen provide a foundation for developing a web application that includes a user-friendly UI, database interaction, and various functionalities. A sketch of the entry point implied by this structure follows Figure 18.
Figure 18 File structure in project
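To make the structure concrete, here is a hedged sketch of the entry point this layout implies; the template engine, session secret, and route module name are assumptions rather than the project's actual code.

    // app.js: a minimal Express setup matching the folder structure above
    const express = require('express');
    const session = require('express-session');
    const path = require('path');

    const app = express();

    app.use(express.urlencoded({ extended: true }));          // parse form/AJAX bodies
    app.use(express.static(path.join(__dirname, 'public')));  // CSS, JS, and images
    app.use(session({
        secret: 'replace-me',   // placeholder secret
        resave: false,
        saveUninitialized: false
    }));

    app.set('views', path.join(__dirname, 'views'));  // pages and UI components
    app.set('view engine', 'ejs');                    // assumed template engine

    app.use('/', require('./routes/index'));          // hypothetical route module

    app.listen(process.env.PORT || 3000);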
Home page
Below is the home page of the application, which includes a search bar with filter options and a list of toys. Each toy in the list displays its name, description, image, price, origin, and category:
Search Bar: Users can use the search bar to input keywords for searching specific toys
Filter Options: The filter options allow users to refine their search based on specific criteria
List of Toys: Each toy item in the list typically includes the following details:
Toy Name: The name or title of the toy
Description: A brief description or information about the toy
Image: An image representing the toy to provide visual information
Price: The cost or price of the toy
Origin: The place or country where the toy is produced
Category: The category or type to which the toy belongs, helping users classify and identify the toy easily
This home page provides a user-friendly interface for searching and browsing toys based on various criteria, making it easier for users to find the toys that match their preferences and needs.
Figure 19 Interface of home page
When the page runs, it triggers the load_data function, which initiates a request to a specific route in my application. This route, in turn, communicates with the database to fetch data related to toys. Once the requested data is retrieved from the database, it is returned to the page, allowing the application to display all the toys on the home page.
    $(document).ready(function () {
        let categoryId;
        load_data('');

        function load_data(filter, category) {
            $.ajax({
                url: '/',
                method: "POST",
                data: { action: 'fetch', filter: filter, category: category },
                dataType: "JSON",
                success: function (data) {
                    var html = '';
                    if (data.data.length > 0) {
                        for (var count = 0; count < data.data.length; count++) {
                            // Build one card per toy; the original markup was not fully
                            // recoverable, so the structure below is a simplified placeholder
                            html += `<div class="toy-card">
                                <h4>` + data.data[count].toyname + `</h4>
                                <p>` + data.data[count].description + `</p>
                                <p>Price: $` + data.data[count].price + `</p>
                                <p>Origin: ` + data.data[count].origin + `</p>
                                <span>` + data.data[count].categoryname + `</span>
                            </div>`;
                        }
                    }
                    $('#toy-list').html(html); // container id assumed
                }
            });
        }
    });
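For completeness, below is a hedged sketch of the server-side route this AJAX call targets; the exact SQL and filter handling in the real project may differ.

    // routes/index.js: a sketch of the route that answers load_data's 'fetch' request
    const express = require('express');
    const router = express.Router();
    const pool = require('../db'); // the shared pool sketched earlier

    router.post('/', async (req, res) => {
        if (req.body.action === 'fetch') {
            // join toy with category so categoryname is available to the page
            let sql = `SELECT t.*, c.categoryname
                       FROM toy t
                       JOIN category c ON t.category_id = c.categoryid`;
            const params = [];
            if (req.body.filter) {
                params.push('%' + req.body.filter + '%');
                sql += ` WHERE t.toyname ILIKE $${params.length}`;
            }
            if (req.body.category) {
                params.push(req.body.category);
                sql += (params.length === 1 ? ' WHERE' : ' AND') + ` t.category_id = $${params.length}`;
            }
            const result = await pool.query(sql, params);
            res.json({ data: result.rows }); // the shape expected by the success callback
        }
    });

    module.exports = router;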
Shop page
If a salesperson wants to access the shop page, they will need to log in; if their login fails, they will receive an error message. Upon successful login, they will be directed to the shop page after a 2-second delay.
This login and redirection process enhances the security of the shop page and ensures that only authorized salespeople can access it.
Figure 22 Error message when login fails
Figure 23 Success message when login succeeds
The shop page is designed for salespeople and provides the following functionalities:
List of Toys: The page displays a list of toys, allowing salespeople to view the available products
Add Toy: Salespeople can add new toys to the inventory, expanding the range of products offered
Update Toy: The ability to update specific toy information enables salespeople to make changes to existing product details
Delete Toy: Salespeople can delete specific toys from the inventory when needed
The page is tailored for salespeople, and upon successful login, each salesperson's unique ID is saved as a session variable. This salesperson ID is used to retrieve and manage the toys they are responsible for in the database. This approach ensures that salespeople can only add, update, and delete the toys they are assigned to oversee, enhancing data security and management in the shop.
Figure 25 Interface of shop page
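A hedged sketch of the login flow just described follows; the route path, view name, and plain-text password comparison mirror the simplicity of the report rather than production practice.

    // routes/shop.js: a sketch of the salesperson login handler (names assumed)
    const express = require('express');
    const router = express.Router();
    const pool = require('../db'); // the shared pool sketched earlier

    router.post('/shop/login', async (req, res) => {
        const { username, password } = req.body;
        const result = await pool.query(
            'SELECT shopid FROM shop WHERE username = $1 AND password = $2',
            [username, password]
        );
        if (result.rows.length === 0) {
            // failed login: show the error message from Figure 22
            return res.render('shop-login', { error: 'Login failed' });
        }
        // successful login: remember which shop this session belongs to
        req.session.shopId = result.rows[0].shopid;
        res.redirect('/shop'); // the 2-second delay is applied client-side
    });

    module.exports = router;

Subsequent shop queries can then be scoped with WHERE shop_id = $1 bound to req.session.shopId, which is what restricts each salesperson to their own toys.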
The result of the updated toy is shown on the home page, as in the screenshot below.
Figure 27 Result when updating successfully
When I fill in the complete information in the form for adding a new toy, the new toy is saved.
Figure 28 Interface when adding a new toy
Figure 29 Result when adding successfully
Admin page
Upon successfully logging in with an admin account, the system sends a message to inform the user that they have administrative privileges. This message serves as a notification that the user has access to admin-specific features and functionalities within the system.
On the admin page, the administrator has the authority to manage both users and shops within the system.
Director page
Similarly, when a user logs in using a director account, the system sends a message to inform them that they have director-level privileges. This message serves as a notification that the user is recognized as a director and is granted access to director-specific features and functions within the system.
If a director chooses a specific shop, they will be able to view all the toys managed by the shop they've selected. This feature allows directors to access and review the inventory of a particular shop, giving them insight into the products managed by that specific shop.
Figure 35 Interface of selecting shop
In my system, if a director selects a specific time for page reloading, the page will automatically refresh after the time interval they have set. I've provided three reloading options: 5 minutes, 30 minutes, and 1 hour. Depending on the selection, the page will refresh at the specified interval, ensuring that the information on the page remains up to date.
Figure 36 Interface of refreshTime selection
Upon page reload, the getRefreshTime() function is invoked, which retrieves the session data for refreshTime from the backend. If the user hasn't explicitly chosen a refresh time, a default value of 5 minutes is set. However, if the user selects a specific refresh time, the session is updated accordingly. Subsequently, the page will automatically refresh at the specified time interval, as defined by the user's choice or the default 5-minute interval if no selection was made.
    // Default interval when no time has been selected: 5 minutes
    let refreshTime = 300000;

    // When the page is loaded, get refreshTime so the currently selected time is not lost
    getRefreshTime();

    // Fetch the chosen refresh time from the backend, then schedule the reload
    function getRefreshTime() {
        $.ajax({
            url: '/director/refresh',
            method: "GET",
            success: function (data) {
                if (data.data) {
                    refreshTime = data.data;
                }
                setTimeout(function () {
                    window.location.reload();
                    console.log("Page is loaded");
                }, refreshTime);
            }
        });
    }

    // Send the chosen refresh time to the backend to save it (so reloading the page does not lose it)
    $('.dropdown-refresh-item').on('click', function () {
        var dataId = $(this).find('button').data('id');
        $.ajax({
            url: '/director/refresh',
            method: "POST",
            data: { timeSet: dataId },
            success: function (data) {
                // reload to apply the new refreshTime
                window.location.reload();
            }
        });
    });
Figure 37 Backend code for setting refreshTime
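Figure 37 is a screenshot, so the exact backend code is not reproduced here; the following is a hedged sketch of the two /director/refresh handlers implied by the description above.

    // routes/director.js: session-backed refresh time (a sketch; names assumed)
    const express = require('express');
    const router = express.Router();

    router.get('/director/refresh', (req, res) => {
        // default to 5 minutes if the director has not chosen a time yet
        if (!req.session.refreshTime) {
            req.session.refreshTime = 300000;
        }
        res.json({ data: req.session.refreshTime });
    });

    router.post('/director/refresh', (req, res) => {
        // persist the chosen interval so a page reload does not lose it
        req.session.refreshTime = Number(req.body.timeSet);
        res.json({ success: true });
    });

    module.exports = router;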
Committing code to Git
Creating repository
Once I've set up a GitHub account, the next step is to create a new repository with a valid name. I set its type to public in order to allow my lecturer to view my project.
Figure 38 Step 1 in creating repository
After that, I can simply copy the HTTPS link for my repository and begin uploading my code to GitHub.
Figure 39 Step 2 in creating repository
Uploading code to GitHub
Step 1: git init
This command initializes a new Git repository in my project directory. It sets up the necessary infrastructure for version control.
Step 2: git checkout -b develop
This command creates a new branch called "develop" and switches to it. Branches allow me to work on different features or versions of my project independently.
Figure 40 Step 1 and 2 in pushing code
Step 3: git add
The first command, git add fileName, stages a specific file for the upcoming commit, indicating that I want to include changes made to that file in the next commit.
The second command, git add ., stages all changes in the project directory, preparing them for the next commit.
Step 4: git commit -m "upload code"
This command creates a commit that saves the changes I've staged. The -m flag is used to provide a commit message that describes the changes I've made.
Figure 41 Step 3 and 4 in pushing code
Step 6: git remote add origin https://github.com/Bminh1709/ATNStore.git (copied from the repository)
This command establishes a connection between my local repository and the remote repository on GitHub. I register the remote repository's URL under the name "origin."
Step 7: git push origin develop
This command pushes my local "develop" branch to the remote repository named "origin" on GitHub. It effectively uploads my code changes to the GitHub repository, making them accessible to others.
Figure 42 Step 6 and 7 in pushing code
Indeed, once my code is successfully uploaded to GitHub, I'm able to view all the files and changes I've pushed to my repository on the GitHub web page. This makes the code accessible and shareable with others, who can review, collaborate on, or download it as needed.
Deploying app to Render
To deploy my website on Render, I follow these steps:
Start by selecting the "Web Service" option on Render to set up my web service
Choose the "Build and Deploy from Git Repository" option to get my code from my GitHub repository
Figure 45 Interface of web service deployment method
Connect the specific repository that I want to deploy
Figure 46 Interface of connecting specific repository
Provide the necessary information for my web page, which might include specifying the branch, build command, environment variables, and other relevant settings
Once I've filled in the required information, click "Finish" to initiate the deployment process
P7 CLOUD PLATFORM PROBLEM ANALYSIS AND SOLUTIONS
Limited Control and Flexibility
In a cloud computing environment, customers often find themselves with limited control and flexibility because the cloud infrastructure is owned, managed, and monitored by the cloud service provider (CSP). Depending on the specific service, cloud users may have reduced control over the functioning and operation of services within the cloud-hosted infrastructure. The CSP's end-user license agreement (EULA) and management policies may impose restrictions on what customers can do with their deployments. While customers maintain control of their applications, data, and services, they may not have the same level of control over the backend infrastructure.
Utilize a Cloud Service Provider Partner: Consider partnering with a cloud service provider partner to assist in implementing, running, and supporting cloud services. They can provide expertise and guidance in navigating the cloud environment.
Understand the Shared Responsibility Model: Familiarize yourself with the responsibilities assigned to you as the customer and those of the cloud vendor under the shared responsibility model. This understanding reduces the risk of omissions or errors in managing your cloud resources.
Evaluate Support Levels: Take the time to understand the basic level of support offered by your cloud service provider and assess whether it meets your support requirements. Most CSPs offer additional support tiers at an extra cost, which may be necessary depending on your specific needs.
Internet connection
Connectivity issues can have a profound impact on businesses that rely heavily on online services. The frequency and severity of these problems may vary depending on your geographic location.
Inadequate attention to these issues can potentially lead to the loss of vital files and data, which can significantly disrupt operations.
To safeguard against connectivity-related setbacks, it's crucial to establish a robust contingency plan. While cloud services are generally reliable, they are not immune to limitations and service interruptions.
Implementing a comprehensive backup and redundancy strategy ensures that any disruptions caused by internet outages can be effectively managed. This approach enhances the resilience of your digital infrastructure and minimizes the risk of data loss during connectivity challenges.
Vendor Lock-In
Vendor lock-in is a concern often associated with cloud computing. It means that transitioning between different cloud service providers can be challenging and not as seamless as one might hope. This limitation arises from the fact that cloud services are not entirely standardized, and the variations between vendor platforms can introduce complexities and additional expenses when attempting to migrate services from one provider to another. Poorly managed migrations may also inadvertently expose sensitive data to heightened security and privacy risks.
Adhere to Cloud Architecture Best Practices: When designing your services, adhere to cloud architecture best practices. These practices generally focus on enhancing availability and performance, decoupling layers, and mitigating performance bottlenecks. By following these principles, you decrease the likelihood of encountering challenges when transitioning between different cloud platforms.
Adopt a Multi-Cloud Strategy: To minimize the risks of vendor lock-in, consider employing a multi-cloud strategy. While this approach may introduce additional complexity in development and operations, it provides the flexibility to select the services and technologies that best suit your requirements. Adequate training can prepare your teams to make sound architectural and technological choices.
Design for Flexibility: When developing applications, prioritize flexibility as a core design principle. This ensures that your applications are portable both now and in the future, reducing your dependence on a single vendor.
Downtime
Downtime is a significant challenge associated with cloud computing, given that cloud systems are internet-based and can experience service outages for various reasons. The financial repercussions of service disruptions and outages can be substantial. A recent survey conducted by the Uptime Institute revealed that approximately 31% of businesses have encountered IT service incidents or outages that had a significant impact on their operations within the past three years. On average, an outage or slowdown can cost a business over $100,000 per hour. Unfortunately, no organization is immune to such disruptions, especially when critical business processes cannot afford interruptions. In 2023, several prominent companies and services, including IT Glue, Microsoft, Google Cloud, AWS, Oracle, and Datadog, experienced outages.
Design for High Availability and Disaster Recovery: Develop your services with high availability and disaster recovery in mind. Take advantage of the multiple availability zones that cloud vendors provide in your infrastructure.
Consider Multi-Region Deployments: If your services have a low tolerance for failure, explore multi-region deployments with automated failover to ensure optimal business continuity.
Implement a Robust Disaster Recovery Plan: Define and implement a disaster recovery plan aligned with your business objectives. This plan should aim for the shortest possible recovery time objective (RTO) and recovery point objective (RPO).
P8 ASSESS COMMON SECURITY ISSUES IN CLOUD ENVIRONMENTS
Overview
It's crucial for businesses to carefully weigh the security threats associated with cloud computing alongside its many advantages. Experts warn that companies that move to the cloud without a well-thought-out plan may expose themselves to potential issues in the future. Furthermore, high-profile cloud security breaches can have a detrimental impact on a company's finances and reputation.
Cloud service providers typically follow a shared responsibility model. They are responsible for the security of the cloud infrastructure itself, while the security of the customer's data stored in the cloud is the customer's responsibility. Regardless of the type of cloud service, such as infrastructure-as-a-service (IaaS) or software-as-a-service (SaaS), users are in charge of protecting their data from security risks and managing access to it.
Most security threats in cloud computing relate to cloud data security. Concerns often revolve around the data that users upload to the cloud, whether due to a lack of visibility into data, difficulty in data regulation, or data theft. As businesses adopt cloud technology, they need to consider the various security issues in cloud computing and implement strategies to mitigate these risks effectively.
Security issues
Misconfiguration
One common problem in cloud computing is misconfigured security settings, which can lead to data breaches. Many businesses don't have effective security measures in place to protect their cloud-based infrastructure.
Several factors contribute to this issue. Cloud systems are designed to be user-friendly and to facilitate easy data sharing, making it challenging to ensure that data is only accessible to authorized users. Consequently, organizations using cloud infrastructure rely on the security controls provided by their cloud service provider (CSP) to establish and protect their cloud setups. However, they also need complete visibility and control over their infrastructure.
Due to the lack of experience in securing cloud infrastructure and the deployment of multiple clouds with different vendor-provided security controls, security lapses and misconfigurations can make an organization's cloud resources vulnerable to attacks.
Unauthorized access
One challenge with cloud-based installations is that they are not within the organization's network perimeter and can be accessed directly from the public internet. While this accessibility is convenient for users and customers, it also makes it easier for unauthorized individuals to access a company's cloud services.
If security configurations are not set up properly or if credentials are compromised, an attacker can potentially gain direct access to the organization's cloud-based services without the organization's knowledge. This unauthorized access can lead to security breaches and data exposure, posing a significant risk to businesses using cloud infrastructure.
Data loss
One significant issue in cloud computing is the risk of data loss, often referred to as a data leak. This happens when insiders, like employees and business partners, have access to sensitive information. If the security of a cloud service is breached, it's possible for hackers to gain access to private or sensitive data.
Companies that use cloud computing need to trust their cloud service provider (CSP) to handle some of their most critical data. If the CSP experiences a breach or attack, the company can lose its data and intellectual property, and may be held liable for any resulting damages.
According to a report from the market intelligence firm IDC, 79 percent of companies experienced at least one or two cloud data breaches within an 18-month period. Data loss can occur due to various issues, including lost or damaged data, hardware problems, loss of access due to natural disasters, and malware attacks for which the CSP is unprepared. This highlights the importance of data protection in cloud computing.
Insecure APIs
Application programming interfaces (APIs) allow customers to customize their experience with cloud services. They handle authentication, access control, and encryption, making it possible for businesses to tailor cloud infrastructure services to their specific needs.
However, the very nature of APIs can introduce security risks. As the API infrastructure grows to provide more services, the potential for security threats increases. APIs give developers the tools to create their own programs and integrate them with other important software. For example, developers can use an API, like the one from YouTube, to include YouTube features in their websites or applications.
The vulnerability of an API lies in the communication between different applications. While this communication can be beneficial for organizations and developers, it also exposes them to potential security threats. Inadequate security in the way APIs communicate can lead to vulnerabilities in a company's cloud-based systems.
Hijacking of accounts
With the increased adoption of cloud services in many enterprises, account hijacking has become a significant problem. Attackers can remotely access sensitive data stored in the cloud using stolen login credentials, which can belong to you or your employees. They may also manipulate or falsify data using these hijacked credentials.
Various techniques are employed to hijack accounts, such as exploiting scripting flaws and reused passwords. For example, Amazon experienced a cross-site scripting flaw in April 2010 that targeted customer credentials. Threats can come from various sources, including phishing, keylogging, and buffer overflow attacks.
One notable threat is the "Man in the Cloud" attack, where attackers steal the tokens that cloud services use to validate individual devices without requiring constant logins for updates and synchronization. This type of attack poses a significant risk to cloud-based account security.
Restricted access to network operations
Transitioning from on-premises data storage to a cloud-based infrastructure can result in limited visibility into network operations, which is a notable drawback. Businesses often grant varying degrees of control to Cloud Service Providers (CSPs) in exchange for benefits such as cost savings and scalable on-demand storage. However, a significant security concern associated with cloud computing is this lack of visibility.
The level of control CSPs have and the data security responsibilities of enterprises depend on the service model. Nevertheless, the ongoing threat is the lack of insight into cloud environments, which can be problematic for companies relying on them to manage critical data, regardless of the shared responsibility model. This limited visibility can make it challenging to monitor and secure network operations effectively.
Figure 49 Cloud computing protection illustration
M3 DISCUSS PROBLEMS AND LIMITATIONS IN THE DEVELOPMENT PROCESS
Cloud computing
Cloud computing involves the provision of a variety of computing services, such as servers, storage, software, analytics, databases, networking, and intelligence, over the Internet (referred to as "the cloud"). This approach offers flexible resources, facilitates quicker innovation, and provides cost-effective solutions.
Cloud services assist organizations in optimizing their infrastructure, reducing operational expenses, and adjusting their capacity as required. In the aftermath of the COVID-19 pandemic, businesses worldwide have boosted their expenditure on cloud technology solutions.
Top challenges in cloud computing
The cloud is a valuable resource with numerous advantages, but it also comes with a set of risks and challenges. This section will delve into some of the prevalent challenges encountered in the field of cloud computing, examining issues related to cloud security and risk, and addressing common problems and their solutions in cloud computing.
Figure 50 Cloud computing challenges illustration
Issues in development process
Data security and privacy
Protecting data security and privacy is of paramount importance when dealing with cloud environments. Users bear the responsibility for safeguarding their data, as not all cloud providers can guarantee 100% data privacy.
Common factors contributing to privacy breaches in the cloud include the absence of identity and access management, insufficient visibility and control tools, data misuse, and misconfiguration of cloud settings. Other concerns involve the threat of malicious insiders, insecure APIs, and lapses in managing cloud data.
Cost management
Even though Cloud Service Providers (CSPs) offer a pay-as-you-go subscription model for their services, organizations may still incur hidden costs due to underutilized resources, which can lead to budget overruns.
Lack of expertise
The cloud computing industry is highly competitive, and many professionals lack the knowledge and skills required for employment in this field. Additionally, there's a significant gap between the supply of certified individuals and the demand for such skills, leading to numerous job vacancies.
Control or governance
Effective IT governance ensures that tools are used properly and assets are implemented in accordance with established procedures and policies. However, a lack of governance is a prevalent issue in cloud computing, with companies often using tools that do not align with their objectives. This can lead to a lack of control over compliance, data quality checks, and risk management during the migration from traditional infrastructure to the cloud.
Compliance
Ensuring strong data compliance policies is essential, but Cloud Service Providers (CSPs) often fall short in this regard. Organizations frequently encounter compliance challenges when transferring data from internal servers to the cloud, as CSPs may not be fully up to date with state laws and regulations.
Multi-cloud environments
Multi-cloud environments introduce a set of issues and challenges, including configuration errors, data governance concerns, a lack of security patches, and limited granularity. It can be difficult to enforce consistent data management policies across diverse cloud platforms while also tracking security requirements across multiple clouds.
Performance challenges
Performance and security in cloud computing solutions are heavily reliant on the capabilities of the vendors. It's important to note that if a cloud vendor experiences downtime, your data may be at risk.
Migration
Moving data to the cloud can be time-consuming, and not all organizations are ready for it. Some encounter extended downtime, security concerns, or difficulties with data formatting and conversions during the process. Cloud migration projects can also become more expensive and challenging than expected.
M4 ADDRESS SECURITY ISSUES WHEN BUILDING SECURE CLOUD PLATFORM
Big questions for cloud security
Is cloud computing secure?
A common question that arises regarding cloud computing is its level of security. The security aspect of cloud computing is a significant concern for many organizations that are contemplating the use of cloud-based applications and infrastructure.
The answer to this question is not straightforward.
Individual cloud computing solutions have the potential to be exceptionally secure, incorporating the latest security measures. In fact, cloud service providers often allocate substantial resources to enhancing the security of their data centers, surpassing what many other organizations can do, as facilitating access to data center infrastructure and applications is central to their business model.
Nonetheless, the responsibility for a significant portion of cloud security often falls on the users of cloud computing services. Failing to adequately address the security risks and issues associated with cloud computing can result in preventable data breaches. According to McAfee's data, "By 2023, at least 99 percent of cloud security failures will be attributable to the customer."
In essence, cloud security varies greatly, depending on how effectively the end user manages and mitigates cloud computing security challenges and risks. In this regard, it is not significantly different from operating on-premises data centers, except for the fact that the cloud data center is located offsite.
Why is cloud security important?
Why is cloud security such a crucial aspect of modern business operations? Cloud security holds the same level of importance as safeguarding your organization's internal network, and the reasons behind its significance are profound. The security of your cloud computing environment matters because it directly affects your reputation, the integrity of your organization, and your ability to function effectively.
However, cloud security is often underestimated or overlooked. This misconception often arises from the assumption that cloud service providers (CSPs) bear the primary responsibility for cloud cybersecurity. Unfortunately, this is not the typical scenario. In most cases, it is the users of cloud computing solutions who carry the responsibility for any data breaches that may occur.
But why are cloud users responsible for data security in cloud computing? This responsibility stems from the fact that, even when CSPs implement robust security controls to prevent, detect, and respond to breaches, the ultimate configuration of the cloud security strategy falls to the end user. If the end user fails to configure their cloud security effectively or, worse, actively works around the CSP's security measures, the CSP has limited ability to prevent a breach.
While the extent of responsibility for strong cloud security can vary among different cloud services, it remains a critical concern across the board.
How to overcome these security issues
Limit the cloud computing vendors
One of the significant hurdles when dealing with cloud-based solutions is the diversity of security tools and processes offered by different providers, which makes them challenging to manage. In this regard, limiting the number of cloud service providers (CSPs) you work with can significantly alleviate the issue.
Whenever possible, consider consolidating your cloud solutions under a single CSP. However, it's important to acknowledge that this might not always be feasible.
Confirm your access to cloud environment information
Visibility plays a crucial role in cybersecurity. It's essential to understand the level of visibility you will have into your cloud environment, preferably before entering into any agreements. Enhanced visibility allows you to monitor and control security measures more effectively.
Verify Security SLAs
Before finalizing an agreement with a cloud service provider, it's essential to thoroughly review the security-related components of their service level agreements (SLAs). Questions to consider include: How swiftly will the CSP respond to and resolve security breaches once detected? What's the expected timeframe for restoring normal service? And who bears the responsibility for notifying affected parties?
By verifying these SLAs before entering into an agreement, you can ensure that they align with the cybersecurity standards relevant to your industry, offer safeguards against extended service disruptions, and establish clear roles and responsibilities following a data breach.
Examine the specific security measures
It's crucial to assess how the CSP intends to protect your cloud environment against potential infiltration by malicious actors and how they plan to contain the spread of attacks between different nodes on their network. Examining the security measures a cloud service provider offers is pivotal for understanding their preparedness to safeguard your data, their ability to meet compliance standards, and the ease of integration within your existing cybersecurity architecture.
Keep in mind that not all cloud solutions come with built-in security for the cloud computing environment. Particularly with Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) solutions, the onus often falls on the customer to incorporate the security systems necessary for protecting the cloud environment.
Employ advanced firewall solutions
To mitigate a range of security challenges in the public cloud, organizations can implement advanced firewall solutions, such as next-generation firewalls (NGFWs) and web application firewalls (WAFs). NGFWs are equipped to identify and thwart advanced threats, including malware and application-layer attack vectors. They also receive updates to adapt to evolving security threats, ensuring continuous protection against the latest cyberattacks.
Concurrently, WAFs serve to safeguard cloud applications against potential exploits. They can be customized with specific rules, for example permitting traffic exclusively from designated Internet Protocol (IP) addresses.
Implement data encryption
Data encryption plays a pivotal role in safeguarding cloud-based data, particularly sensitive information stored in or transmitted to and from the cloud. By encrypting data, organizations ensure that it remains secure both at rest within data storage applications and in transit between on-premises systems and the cloud. Encryption ensures that information, even if intercepted, cannot be read or misused.
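To illustrate, the sketch below uses Node's built-in crypto module (matching the JavaScript stack used earlier in this report) to encrypt and decrypt a value with AES-256-GCM; reading the key from an environment variable is an assumption for brevity, and a managed key service would be preferable in practice.

    // encrypt.js: AES-256-GCM encryption sketch using Node's crypto module
    const crypto = require('crypto');
    const key = Buffer.from(process.env.DATA_KEY, 'hex'); // 32-byte key (assumed env var)

    function encrypt(plaintext) {
        const iv = crypto.randomBytes(12); // unique nonce for every message
        const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
        const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
        const tag = cipher.getAuthTag(); // authentication tag guards integrity
        return Buffer.concat([iv, tag, ciphertext]).toString('base64');
    }

    function decrypt(payload) {
        const buf = Buffer.from(payload, 'base64');
        const iv = buf.subarray(0, 12);
        const tag = buf.subarray(12, 28);
        const data = buf.subarray(28);
        const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
        decipher.setAuthTag(tag);
        return Buffer.concat([decipher.update(data), decipher.final()]).toString('utf8');
    }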
Seek advice from cybersecurity experts
When in doubt, it's advisable to seek assistance. If you ever find yourself uncertain about whether a particular cloud solution includes adequate security measures to safeguard your organization's data, personnel, and clients, consulting a cybersecurity expert can provide valuable insights and help you make more informed decisions to enhance your organization's long-term security.
D2 CRITICALLY DISCUSS HOW ORGANIZATIONS OVERCOME ISSUES
Solutions
To address the data security and privacy issues described above:
Stay updated with the latest software patches and configurations to mitigate security vulnerabilities
Utilize antivirus and firewall solutions to bolster security measures
Consider increasing bandwidth to ensure data availability in the cloud
Implement cybersecurity solutions to protect against data security risks and threats
By following these measures, organizations can reduce the likelihood of data security breaches and maintain a higher level of privacy in their cloud environments.
Understand that not all organizations have the same resources to invest in high-end security solutions. Smaller businesses may need to balance security with budget constraints.
To address the cost management issue, organizations can:
Implement resource utilization monitoring tools to keep track of their cloud resource usage
Conduct regular audits to identify and rectify inefficient resource allocation
Utilize efficient cost management strategies to mitigate the risks of overspending
Smaller organizations may find it challenging to allocate resources for implementing complex monitoring solutions.
To bridge the expertise gap, companies can:
Support their existing IT staff in advancing their careers by investing in cloud training programs
Provide opportunities for employees to acquire cloud-related certifications and skills
Foster a culture of continuous learning and development within the organization
Smaller businesses may struggle to allocate budget and time for training, and alternative approaches should be explored.
To address the governance challenge, organizations can consider adapting traditional IT operations and governance practices to facilitate a smoother transition during cloud migrations. By doing so, they can better manage and govern their cloud environment, ensuring alignment with their compliance, data quality, and risk management objectives.
Large enterprises may face organizational resistance when implementing traditional governance practices in cloud environments.
A potential solution to the compliance problem lies in the General Data Protection Regulation (GDPR), which is anticipated to address compliance issues for CSPs in the future. This regulatory framework aims to provide a more robust and consistent approach to data protection and privacy, potentially alleviating compliance concerns in cloud computing.
Some industries face more stringent compliance requirements than others, requiring tailored solutions and vigilant monitoring.
To tackle the multi-cloud challenges, organizations can implement a multi-cloud data management solution. However, it's crucial to select the right solution carefully, as not all tools offer the specific security functionality needed to manage multi-cloud environments effectively. Given the increasing complexity of multi-cloud setups, organizations should opt for solutions that can adapt to the evolving landscape of cloud technology.
Small organizations may struggle to afford comprehensive multi-cloud solutions, necessitating a balance between features and budget.
One way to address the performance challenges is to ensure that Cloud Service Providers (CSPs) have real-time Software as a Service (SaaS) monitoring policies in place. These policies can help detect and mitigate performance issues and downtime more effectively, reducing the risk of data loss and service disruptions.
Organizations with limited budgets may struggle to invest in comprehensive real-time monitoring solutions, requiring a strategic approach to prioritize critical areas.
To address the migration issues, organizations should consider hiring in-house experts to manage their cloud data migration and invest more in this process. It's important to have experts analyze cloud computing challenges and solutions before committing to the latest platforms and services offered by Cloud Service Providers (CSPs). This approach can help make the migration smoother and more cost-effective.
D3 CRITICALLY DISCUSS HOW ORGANIZATIONS PROTECT DATA WHEN MIGRATING TO THE CLOUD
Cloud migration is a complex and crucial endeavour for businesses, and without a robust data security plan it can leave your organization vulnerable to data breaches and cyberattacks. Companies unaware of the risks can inadvertently expose their data during migration, especially when employing inadequate data security protocols. Such lapses in security can make your databases susceptible to cyber threats throughout the entire migration process, offering hackers unrestricted access to valuable data.
To safeguard your data and maintain a secure cloud migration, it's essential to maintain a high level of vigilance and adhere to the best practices mentioned below. These practices are designed to enhance data security and protect your company's most valuable asset at every stage of the cloud migration process. By prioritizing data security and following these guidelines, you can mitigate the risks associated with cloud migration and ensure the confidentiality and integrity of your data.
In 2020, the United States alone reported more than 1,001 cases of data breaches. In today's business landscape, data has become the most invaluable asset across all industries. It is paramount to safeguard this asset rigorously, whether you are in the process of migrating to the cloud or maintaining your on-premises servers.
Regrettably, during the cloud migration phase, there are instances when data becomes vulnerable and exposed. Consider the case of the Keepnet Labs data breach, where a contractor turned off the firewall for a mere ten minutes while migrating to ElasticSearch. This inadvertent action exposed the database to attackers, who subsequently breached over 5 billion data records.
How to protect data when migrating to the cloud
Evaluate current data landscape
In many cases, organizations may overlook the need to assess their existing data landscape for extended periods, sometimes even years. Over time, data accumulates across multiple databases, and it's not uncommon to lack any differentiation between critical data and obsolete data stored for an extended period.
Consequently, the first step in any migration process is to comprehensively assess your data. This assessment aims to align your data holdings with the retention policies established in your data governance framework. Data migration, given its sensitivity, can introduce risks to your company's data if not executed meticulously.
The migration phase offers the ideal opportunity to conduct these assessments, and cloud vendors often provide a range of assessment tools to facilitate this process. For instance, the AWS Schema Conversion Tool (AWS SCT) can be employed to generate a comprehensive database migration assessment report. This tool is specifically designed to assist in converting an existing database schema from one database engine to another.
Comprehend the regulatory compliance framework
Compliance prerequisites vary considerably across industries, with sectors like healthcare, finance, and eCommerce subject to stringent regulatory oversight. It is imperative to harmonize your data controls in strict accordance with the pertinent regulatory bodies that govern your business-critical activities. A diligent review of the applicable regulations is essential to preempt penalties or fines resulting from compliance infringements, particularly during the planned migration phase.
Depending on your industry and the type of data you handle, there may be specific data transit regulations you must adhere to. For instance, healthcare organizations in the United States are obligated to conform to HIPAA regulations governing data transit. Understanding and adhering to the relevant compliance framework is thus paramount to ensuring a smooth and lawful migration process.
Select appropriate data security tools
Utilizing the right data security tools is imperative throughout the migration process. It's essential to explore the resources provided by your chosen cloud vendor to identify the tools that can facilitate a secure migration. Additionally, regulatory requirements play a vital role in guiding the selection of the data security tools needed during the migration.
For instance, if your organization must adhere to HIPAA compliance, you can confidently leverage the AWS Database Migration Service (DMS), which is part of AWS's HIPAA compliance program. With AWS DMS, you can securely transfer data between your HIPAA-compliant applications, including protected health information (PHI), under the framework of your executed Business Associate Agreement (BAA) with AWS.
In the selection of data security tools, it's crucial to strike a balance. Avoid unnecessary overspending on tools that might be excessive for the tasks at hand. Multiple solutions are available to accomplish similar goals, so prioritize cost-effective solutions that align with your specific needs. Always aim for a balanced approach that offers both security and efficiency.
Manage authorized personnel for data access
The principle of least privilege plays a crucial role in ensuring data security during migration. Organizations must implement stringent access control measures throughout a cloud migration, and access to critical data should be restricted to a select group of authorized personnel.
To mitigate risks and maintain data security, only trusted individuals within the organization who possess a thorough understanding of the security measures should be granted access. Additionally, it is essential to enforce two-factor authentication (2FA) as an added layer of protection and identity verification.
Upon completion of the migration, access privileges can be reinstated safely while upholding all security protocols. Limiting authorization during the migration phase is a vital step in preventing unauthorized users and malicious actors from gaining access to temporarily exposed data. This practice reinforces data security and minimizes potential threats.
Encrypt data during transit
Maintaining data security during transit is crucial to preventing vulnerabilities and security breaches. Incidents like disabling firewalls, as illustrated in the Keepnet Labs case, can expose data to significant risk, even for a brief period.
To ensure the secure transfer of data, organizations should implement robust network security controls and employ encrypted network protocols. These measures prevent unauthorized third parties from intercepting and compromising sensitive data in transit.
In cloud environments, many cloud service providers offer concierge-style solutions designed to facilitate swift and secure data migration. These solutions adhere to stringent security protocols, enhancing the overall security of data transit. By leveraging such services, organizations can effectively safeguard their data during the migration process.
Plan decommissioning activities for the remaining data center
Following a migration to the cloud, organizations often neglect their physical or on-premises infrastructure, leaving media and hardware assets unused. To address this, it's imperative to have a comprehensive decommissioning plan in place for any retained hardware and media.
This plan should outline the organization's intentions regarding the reuse or potential resale of existing hardware equipment. The decommissioning plan typically encompasses several steps, including inventory assessment, planning, equipment removal, and data sanitization.
In cases where physical storage is integrated as part of a hybrid cloud infrastructure, it's essential to implement a robust security plan that covers both on-premises and cloud architectures. This comprehensive security strategy helps safeguard data across all environments, providing end-to-end protection.