Planning a computing project involves identifying and organizing the computing resources needed to carry out the project efficiently and achieve its stated objectives. This covers requirements analysis, scheduling, resource allocation, and progress tracking.
ASSIGNMENT 2 FRONT SHEET
Qualification BTEC Level 5 HND Diploma in Computing
Unit number and title Unit 06: Planning a computing project
Submission date: 7-4-2024
Date received (1st submission):
Student declaration
I certify that the assignment submission is entirely my own work and I fully understand the consequences of plagiarism. I understand that making a false declaration is a form of malpractice.
Student’s signature: Anh
Grading grid
Summative Feedback: Resubmission Feedback:
IV Signature:
Table of Contents
A: Introduction 5
I: Project purpose: 6
II: The objectives of the project 7
P5: Devise comprehensive project plans for a chosen scenario, including a work and resource allocation breakdown using appropriate tools 8
1 Overview: 8
2 Project Scope and Deliverables: 8
3 Work Breakdown Structure (WBS): 9
4 Work timeline: 12
P6 Communicate appropriate project recommendations for technical and non-technical audiences 16
1: Stakeholders 16
2: Project Recommendations for Technical Audience: 17
3: Project Recommendations for Non-Technical Audience 18
P7 Present arguments for the planning decisions made when developing the project plans 20
P8 Discuss accuracy and reliability of the different research methods applied 25
1: Primary Research 25
1.1: Qualitative Research: 25
1.2: Quantitative Research (Survey): 26
2: Secondary Research: 27
C: Conclusion 28
D: Reference 29
Table of Figures
Figure 1: My WBS 11
Figure 2: Gantt Chart 1 12
Figure 3: Gantt Chart 2 12
Figure 4: Gantt Chart 3 13
Figure 5: Gantt Chart 4 13
Figure 6: Gantt Chart 5 14
Figure 7: Gantt Chart 6 14
A: Introduction
In recent years, the surge in digital technologies has catalyzed an unprecedented influx of data across various sectors, including academia. This deluge of information, commonly referred to as Big Data, presents a wealth of opportunities for educational institutions seeking to enhance their operations and decision-making.

Moreover, the application of Big Data analytics holds immense potential in the realm of academic performance analysis. By scrutinizing student performance data, institutions can discern trends indicative of success or struggle, thus enabling the implementation of personalized support mechanisms and interventions. Such a data-centric approach not only enriches the learning experience but also cultivates improved student outcomes, fostering a culture of continuous academic enhancement.
However, despite the promise and potential of Big Data technologies, their integration into academic settings is not devoid of challenges. Privacy and security concerns surrounding student data demand meticulous attention to ensure compliance with pertinent regulations and safeguard sensitive information. Additionally, issues pertaining to data quality and integration necessitate concerted efforts to harmonize diverse data sources for effective analysis. Furthermore, the successful implementation of Big Data technologies hinges upon the availability of skilled data analysts and robust IT infrastructure.
B: Contents
I: Project purpose:
The potential of applying cloud computing for processing large datasets stored on a cloud system is profound and transformative. Cloud computing represents a paradigm shift in how organizations approach data processing, offering unparalleled scalability and flexibility. By harnessing the power of cloud resources, institutions can effectively manage vast amounts of data without the need for costly infrastructure investments. This scalability enables organizations to adapt to fluctuating data volumes seamlessly, ensuring that processing capabilities align with evolving needs.
One of the most compelling aspects of cloud computing is its cost-effectiveness. Traditional data processing methods often require significant upfront investments in hardware and infrastructure. In contrast, cloud computing operates on a pay-as-you-go model, allowing organizations to scale resources up or down based on demand. This not only reduces initial capital expenditures but also optimizes operational costs over time, making data processing more accessible and affordable.

Furthermore, cloud computing offers advanced tools and technologies for analyzing data. From sophisticated analytics platforms to machine learning algorithms, cloud service providers equip researchers with the means to extract valuable insights from large datasets efficiently. These analytical capabilities empower organizations to make data-driven decisions, driving innovation and competitive advantage.
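To make the pay-as-you-go comparison above concrete, here is a minimal break-even sketch. All figures (the upfront server cost, the hourly cloud rate, and the monthly usage) are illustrative assumptions, not real vendor pricing.

```python
# Toy break-even model: upfront hardware purchase vs. pay-as-you-go cloud.
# All numbers are illustrative assumptions, not real vendor pricing.

UPFRONT_SERVER_COST = 20_000.0  # assumed one-time hardware cost (USD)
CLOUD_RATE_PER_HOUR = 0.50      # assumed on-demand rate per instance-hour (USD)

def cloud_cost(hours_used: float) -> float:
    """Pay-as-you-go: pay only for the hours actually consumed."""
    return CLOUD_RATE_PER_HOUR * hours_used

# At ~200 processing hours per month, find when cumulative cloud spend
# would overtake the one-time hardware purchase.
HOURS_PER_MONTH = 200
for month in range(1, 61):
    spend = cloud_cost(HOURS_PER_MONTH * month)
    if spend >= UPFRONT_SERVER_COST:
        print(f"Cloud spend passes the upfront cost in month {month} (${spend:,.0f})")
        break
else:
    print("Cloud stays cheaper than the upfront purchase over five years")
```

Under these assumed numbers the cloud option stays cheaper for the full five-year horizon, which is exactly the kind of sensitivity check the pay-as-you-go argument invites.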
Security is paramount in the realm of data processing, especially when dealing with sensitive information. Cloud service providers prioritize security measures, implementing robust encryption, access controls, and compliance frameworks to safeguard data integrity and confidentiality. By entrusting data to reputable cloud platforms, organizations can mitigate security risks and ensure compliance with regulatory requirements.
II: The objectives of the project
The objectives of the project focused on harnessing the potential of cloud computing for processing large datasets stored on a cloud system are as follows:
Evaluate Cloud Computing Platforms: Conduct thorough research and evaluation of various cloud computing platforms (e.g., AWS, Azure, GCP) to identify the most suitable platform for processing large datasets efficiently.
Design Scalable Infrastructure: Design and deploy a scalable cloud infrastructure capable of storing and processing large datasets. This includes setting up virtual machines, storage solutions, networking components, and security measures (a brief provisioning sketch follows this list).
Develop Optimized Algorithms: Develop and optimize data processing algorithms or workflows specifically tailored for cloud environments. Utilize parallel processing techniques and distributed computing frameworks to maximize performance and efficiency.
Implement Data Processing Workflows: Implement the developed algorithms and workflows on the selected cloud platform. Ensure seamless integration with cloud services and optimize configurations for efficient data processing.
Document Best Practices: Document best practices, guidelines, and recommendations for leveraging cloud computing for processing large datasets. This includes infrastructure setup procedures, algorithm design principles, cost management strategies, and security considerations.
Facilitate Knowledge Transfer: Provide user manuals, guides, and training sessions to facilitate knowledge transfer and enable stakeholders to leverage cloud computing effectively for large dataset processing tasks.
Deliver Comprehensive Reports: Compile research findings, infrastructure setup details, algorithm implementations, testing results, and best practices into comprehensive reports. Present key insights, recommendations, and lessons learned to stakeholders for informed decision-making.
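As referenced under "Design Scalable Infrastructure" above, the sketch below shows roughly what provisioning a storage bucket and a worker VM could look like with the boto3 AWS SDK, one of the platforms named in the objectives. It is a minimal illustration under stated assumptions, not the project's actual setup: the bucket name, region, AMI ID, and instance type are placeholders, and a real deployment would add networking, IAM, and monitoring configuration.

```python
# Minimal provisioning sketch using boto3 (AWS SDK for Python).
# Bucket name, region, AMI ID, and instance type are placeholders.
import boto3

REGION = "ap-southeast-1"                # assumed region
BUCKET = "example-large-dataset-bucket"  # hypothetical bucket name

# 1. Object storage for the raw datasets, with default server-side encryption.
s3 = boto3.client("s3", region_name=REGION)
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)

# 2. A single worker VM for processing jobs.
ec2 = boto3.resource("ec2", region_name=REGION)
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="m5.xlarge",         # assumed instance size
    MinCount=1,
    MaxCount=1,
)
print("Launched worker instance:", instances[0].id)
```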
By accomplishing these objectives, the project aims to empower organizations with the knowledge, tools, and capabilities needed to harness the full potential of cloud computing for processing large datasets efficiently and effectively.
P5: Devise comprehensive project plans for a chosen scenario, including a work and resource allocation breakdown using appropriate tools
1 Overview:
The exponential growth of data in various domains, including but not limited to business, science, and technology, has led to an increased demand for efficient methods of processing and analyzing large datasets. Traditional computing infrastructures often struggle to keep up with the scale and complexity of these datasets, resulting in performance bottlenecks and increased processing times. Cloud computing, with its scalable and on-demand resources, offers a promising solution to this challenge. By leveraging cloud-based infrastructure and services, organizations can efficiently store, process, and analyze large datasets without the need for significant upfront investments in hardware and infrastructure.
This project seeks to explore the potential of applying cloud computing technologies for processing large datasets stored on cloud systems. By harnessing the scalability, flexibility, and cost-effectiveness of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP), the project aims to develop optimized solutions for handling massive datasets efficiently. Through a combination of research, infrastructure setup, algorithm development, testing, and documentation, the project will provide insights and guidelines for organizations looking to leverage cloud computing for large-scale data processing tasks.
2 Project Scope and Deliverables:
Project Scope:
Research and Evaluation: Conduct comprehensive research on various cloud computing platforms and their suitability for large dataset processing. Evaluate factors such as scalability, performance, cost, and ease of use.
Infrastructure Setup: Design and deploy a scalable cloud infrastructure capable of storing and processing large datasets. Configure security measures, data backup mechanisms, and monitoring tools to ensure data integrity and availability.
Algorithm Development: Develop and optimize data processing algorithms or workflows specifically designed for cloud environments. Utilize parallel processing techniques, distributed computing frameworks (e.g., Apache Spark), and cloud-native services to maximize performance and efficiency (a brief Spark sketch follows this list).

Deployment and Testing: Deploy the developed algorithms on the cloud infrastructure and conduct rigorous performance testing. Evaluate factors such as processing speed, scalability, resource utilization, and cost-effectiveness. Identify and address any bottlenecks or performance issues.
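As referenced in the "Algorithm Development" item above, the following is a minimal PySpark sketch of the kind of distributed workflow the scope describes: reading a large dataset from cloud object storage and aggregating it in parallel. The input path and column names are hypothetical examples, not the project's real data.

```python
# Minimal distributed aggregation with Apache Spark (PySpark).
# The input path and column names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("large-dataset-aggregation")
    .getOrCreate()
)

# Spark splits the input into partitions and processes them in parallel
# across the cluster's executors.
df = spark.read.csv("s3a://example-bucket/events/*.csv",
                    header=True, inferSchema=True)

summary = (
    df.groupBy("category")                     # hypothetical grouping column
      .agg(F.count("*").alias("rows"),
           F.avg("value").alias("avg_value"))  # hypothetical numeric column
)

summary.write.mode("overwrite").parquet("s3a://example-bucket/output/summary")
spark.stop()
```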
Deliverables:
Research Report: A comprehensive report detailing the research findings on various cloud computing platforms and their suitability for large dataset processing. This report will include an analysis of key factors such as scalability, performance, cost, and ease of use.
Infrastructure Setup Documentation: Detailed documentation outlining the setup procedures and configurations for the cloud infrastructure. This documentation will cover aspects such as virtual machine provisioning, storage configuration, network setup, security measures, and monitoring tools.
Algorithm Implementation: Developed data processing algorithms or workflows optimized for cloud environments. This includes code repositories, implementation details, and integration with cloud services.
Best Practices Documentation: Guidelines, best practices, and recommendations for leveraging cloud computing for large dataset processing. This documentation will cover topics such as algorithm design, infrastructure optimization, cost management, and security considerations.
By delivering this project scope and its associated deliverables, the project aims to provide valuable insights and actionable recommendations for organizations seeking to harness the potential of cloud computing for processing large datasets effectively.
3 Work Breakdown Structure (WBS):
WBS stands for Work Breakdown Structure. It is a project management technique used to break down a project or work scope into smaller, more manageable components. The purpose of creating a WBS is to organize and define the work required to complete the project.
The WBS is typically represented as a hierarchical structure, starting with the highest level, which represents the main deliverables or phases of the project. Each subsequent level represents a further breakdown of the deliverables into smaller and more specific components. The lowest level of the WBS consists of work packages or tasks that can be assigned to individuals or teams for execution.
The WBS provides a visual representation of the project's scope and helps in understanding the relationships between different components of the project. It enables project managers to effectively plan, schedule, and control the project by identifying all necessary work and ensuring that it is assigned to the appropriate resources.
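Because the WBS is a hierarchy, it maps naturally onto a nested data structure. The sketch below models a small slice of this project's WBS as nested Python dictionaries and prints it level by level; the task names are abbreviated examples rather than the full plan shown in Figure 1.

```python
# A small slice of the project WBS as a nested dictionary:
# deliverables -> sub-deliverables -> work packages (leaf tasks).
wbs = {
    "Infrastructure Setup": {
        "Design architecture": ["Choose cloud platform", "Size storage and compute"],
        "Configure security": ["Set up access controls", "Enable encryption"],
    },
    "Algorithm Development": {
        "Build workflows": ["Implement Spark jobs", "Tune parallelism"],
    },
}

def print_wbs(node, depth=0):
    """Walk the hierarchy and print each level with indentation."""
    indent = "  " * depth
    if isinstance(node, dict):
        for name, children in node.items():
            print(f"{indent}- {name}")
            print_wbs(children, depth + 1)
    else:  # a list of leaf work packages
        for task in node:
            print(f"{indent}- {task}")

print_wbs(wbs)
```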
To create a WBS for your project, you’ll need information from other project management documents.
Here are six simple steps to create a work breakdown structure:
1: Define Project Objectives and Scope
• Understand the overarching goal of the project, which in this case is to explore and leverage cloud computing for processing large datasets stored on a cloud system.
• Define the scope of the project, including the specific tasks and deliverables.
2: Identify Major Deliverables
• Determine the key deliverables of the project. This may include research reports, infrastructure setup, algorithm development, testing results, documentation, and training materials.
3: Break Down Deliverables into Sub-Deliverables
• Decompose each major deliverable into smaller, manageable components. For example, infrastructure setup may include tasks such as designing the architecture, setting up virtual machines, and configuring security measures.
4: Identify Work Packages
• Break down sub-deliverables into work packages, which are the lowest level of tasks in the WBS. These are actionable items that can be assigned to team members and tracked individually.
5: Assign Responsibility and Resources
• Determine who will be responsible for each work package and allocate the necessary resources, including personnel, time, and budget.
6: Review and Validate
• Review the WBS with key stakeholders to ensure that all tasks are captured and properly organized.
• Validate the WBS against project objectives, scope, and constraints to ensure completeness and accuracy.
Figure 1: My WBS
4 Work timeline:
I will use Gantt chart software to create a work timeline for my project; a minimal scripted alternative is sketched below.
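Alongside dedicated Gantt chart software, a simple chart can also be scripted. The sketch below draws the seven-month milestone timeline with matplotlib; the start months are taken from the milestone list that follows, while the durations are assumptions (roughly one month each, with methodology development assumed to span months 4-5).

```python
# Minimal Gantt chart of the milestone timeline using matplotlib.
# Start months come from the milestone list below; durations are assumed.
import matplotlib.pyplot as plt

tasks = [  # (milestone, start month, assumed duration in months)
    ("Research Initiation",     1, 1),
    ("Data Collection",         2, 1),
    ("Literature Review",       3, 1),
    ("Methodology Development", 4, 2),  # assumed to span months 4-5
    ("Data Analysis",           6, 1),
    ("Report Writing",          7, 1),
]

fig, ax = plt.subplots(figsize=(8, 3))
for row, (name, start, duration) in enumerate(tasks):
    ax.barh(row, duration, left=start, height=0.5)

ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([t[0] for t in tasks])
ax.invert_yaxis()  # first milestone at the top
ax.set_xlabel("Project month")
ax.set_title("Project work timeline")
plt.tight_layout()
plt.savefig("gantt.png")
```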
Milestone 1: Research Initiation (Month 1)
Objective: Establish a clear direction for the research study and define its scope
Stakeholders: Research team, project manager, sponsors
Result: The research objectives and scope were clearly defined, existing literature was reviewed, and key research questions and hypotheses were identified, providing a solid foundation for the subsequent phases.
Figure 2: Gantt Chart 1
Milestone 2: Data Collection Phase (Month 2)
Objective: Gather relevant data and resources necessary for the research
Stakeholders: Research team, data providers, project manager
Result: Data on cloud computing platforms and datasets for the research were collected, and relevant case studies were identified, enabling the research to proceed with adequate resources and examples.
Figure 3: Gantt Chart 2
Milestone 3: Literature Review and Analysis (Month 3)
Objective: Analyze existing research to identify trends, challenges, and opportunities in cloud-based data processing
Stakeholders: Research team, academic community, project manager
Result: Existing literature and case studies were thoroughly analyzed, providing insights into current trends, challenges, and opportunities in cloud-based data processing, which guided the direction of the research.
Figure 4: Gantt Chart 3
Milestone 4: Research Methodology Development (Month 4)
Objective: Design the methodology for the research study and establish criteria for evaluating cloud-based solutions
Stakeholders: Research team, project manager, evaluators
Result: A robust research methodology was developed, along with criteria for evaluating cloud-based solutions, ensuring the research study's credibility and effectiveness.
Figure 5: Gantt Chart 4
Milestone 5: Data Analysis and Interpretation (Month 6)
Objective: Analyze collected data and interpret findings in relation to research objectives
Stakeholders: Research team, data analysts, project manager
Result: Data from the literature review and case studies were analyzed, findings were interpreted, and gaps in existing knowledge were identified, laying the groundwork for further investigation and analysis.
Figure 6: Gantt Chart 5
Milestone 6: Report Writing and Documentation (Month 7)
Objective: Summarize research findings, document methodologies, and provide recommendations
Stakeholders: Research team, project manager, sponsors, stakeholders
Result: Research findings were summarized in a comprehensive report, methodologies and analyses were documented, and insights and recommendations were provided, fulfilling the objectives of the research study.
Figure 7: Gantt Chart 6