cc lab manual
MODULE - I
Ex.No: 1a
INSTALLING A C COMPILER IN THE VIRTUAL MACHINE AND EXECUTING A SAMPLE PROGRAM
Date:
AIM:
To find a procedure to install a C compiler in the virtual machine and execute a sample
program.
PROCEDURE:
Step 2: To check if GCC is already installed, type the command gcc --version
If the system displays the version information after executing the above command, then GCC is already present.
Step 3: If GCC is not present, then execute the following command:
sudo apt install build-essential
The above command has to be executed only if GCC is not present.
Step 4: Now let us create a C file in which to write our hello world program.
If you want to create the C file on the desktop, set the path accordingly.
To create a file in Ubuntu, we can use the command: touch <file name>
Type the program in the window that appears, save the file by pressing Ctrl+S, and close the file.
Step 8: Since it is a binary file, we can execute it. To execute it, run the command ./test
RESULT:
Thus, the procedure to install a C compiler in the virtual machine and execute a
sample program was executed successfully.
Ex.No: 1b
VIRTUAL MACHINE MIGRATION
Date:
AIM:
To show the migration of a virtual machine from one node to the other based on a certain condition.
PROCEDURE:
Step1: Select the VM and click File->Export Appliance
Step7: Go to File->Computer:/home/sam/Documents/
RESULT:
Thus, the migration of a virtual machine from one node to the other based on the given condition was shown successfully.
MODULE - II
Ex. No: 2a
INSTALLING GOOGLE APP ENGINE & CREATING HELLO WORLD APP
Date:
AIM:
To find a procedure to install Google App Engine and create Hello World and other simple web applications using Python.
PROCEDURE:
In the window that appears next, click on the drop-down arrow near the project name.
In the window that appears after that, select the option New Project.
In the New Project tab that appears give project name and click create option.
Step 3: To view the list of projects available in the selected console, type the command gcloud projects list.
Once this command is entered, it will ask us to authorize Cloud Shell.
This command will list all the projects available in that particular console.
Step 4: Select APIs and services from the menu to enable Google App Engine.
In the Welcome to API Library page, search for Google App Engine in the search tab.
Step 6: Now go to Cloud Shell and type the command gcloud config set project <project_id>
Now either you can click on Create Application, or go to the shell and type the command gcloud app create.
After entering the gcloud app create command, it will display a list of locations and ask us to enter the numeric value corresponding to a location.
Step 8: Now clone the Hello World sample app repository to our local machine.
Step 13: Now select the Billing menu and click Link Billing account.
Step 15: Once the billing is done, copy and paste the link that appears in the next window into a new tab.
RESULT:
Thus, the procedure to install Google App Engine and create Hello World and other simple web applications using Python was executed successfully.
Ex. No: 2b
LAUNCHING WEB APPLICATION USING GAE
Date:
AIM:
To find a procedure to launch a web application using Google App Engine.
PROCEDURE:
Step 2: Now double click on the setup file and proceed with the installation process.
On the guide page, click on the Google Cloud CLI installer and proceed with the installation process.
In the Google Cloud SDK setup page that appears next, simply click the Next button.
In the next window that appears, select the option Single user and click the Next button.
Step 4: Now create a folder named app on the desktop, and then create a subfolder named ae-01-trivial inside app.
application: ae-01-trivial
version: 1
runtime: python
api_version: 1
handlers:
- url: /.*
script: index.py
and then save the file. While saving, change the file type to All Files and give the filename as app.yaml.
Step 6: Open Notepad and type the following code in it,
and then save the file. While saving, change the file type to All Files and give the filename as index.py.
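The listing for index.py is not reproduced in this copy of the manual; as a sketch, a minimal CGI-style script consistent with the `script: index.py` handler declared in app.yaml above might look like this (the exact contents used in the lab may differ):

```python
# Hypothetical minimal index.py for the legacy CGI-style Python runtime
# declared in app.yaml; the actual listing used in the lab may differ.

def response():
    # A CGI script emits an HTTP header block, a blank line, then the body.
    return "Content-Type: text/plain\n\nHello, World!"

if __name__ == "__main__":
    print(response())
```

When dev_appserver.py serves the app, every URL is routed to this script by the `url: /.*` handler, and whatever the script prints becomes the HTTP response.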
Step 8: The Cloud SDK Shell will open; in it, type the command google-cloud-sdk\bin\dev_appserver.py "<app folder path>"
Step 9: Copy and paste the URL into any browser and press Enter.
RESULT:
Thus, the procedure to launch a web application using Google App Engine was executed successfully.
MODULE - III
Ex. No: 3a
SIMULATE A CLOUD SCENARIO USING CLOUD SIM
Date:
AIM:
To simulate a cloud scenario using CloudSim and run a scheduling algorithm using CloudSim.
PROCEDURE:
Step 1: Now within the Eclipse window, navigate the menu File -> New -> Project to open the new project wizard.
Step 2: Select the ‘Java Project’ option; once done, click ‘Next’.
Step 3: Provide the project name and the path of the CloudSim project source code.
Step 4: Unselect the ‘Use default location’ option and then click on ‘Browse’ to open the path where you have unzipped the CloudSim project, and finally click Next to set the project settings.
Step 5: Navigate the path until you can see the bin, docs, examples etc. folders in the navigation pane.
Step 6: Once done, click ‘Next’ to go to the next step, i.e. setting up the project settings. Now open the ‘Libraries’ tab in the list, then simply click on ‘Add External JARs’ (commons-math3-3.x.jar will be included in the project in this step).
Step 7: Open the path where you have unzipped the commons-math binaries and select the commons-math3-3.x.jar file.
Step 8: Ensure the external jar that you opened in the previous step is displayed in the list, and then click on ‘Finish’.
Step 9: Once the project is configured you can open the ‘Project Explorer‘ and start
exploring the Cloudsim project.
The following is the final screen you will see after CloudSim is configured.
Step 10: Within the ‘Project Explorer’, you should navigate to the ‘examples’ folder, then expand the package ‘org.cloudbus.cloudsim.examples’ and double click to open ‘CloudsimExample1.java’.
Step 11: Now navigate to the Eclipse menu ‘Run -> Run’, or directly use the keyboard shortcut Ctrl+F11.
Step 12: The following displays the output in the console window of the Eclipse IDE.
CloudSimExample1.java
The first step is to initialize the CloudSim package by initializing the CloudSim library, as follows:
The fourth step is to create one virtual machine, specifying the unique ID of the VM, the user ID of the VM’s owner, the MIPS rating, the number of PEs (the amount of CPUs), the amount of RAM, the amount of bandwidth, the amount of storage, the virtual machine monitor, and the cloudlet scheduler policy for cloudlets.
Create a cloudlet with length, file size, output size, and utilization model:
Cloudlet cloudlet = new Cloudlet(id, length, pesNumber, fileSize, outputSize, utilizationModel, utilizationModel, utilizationModel);
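As an illustration only (plain Python, not the CloudSim API), the relationship between a cloudlet's length and a VM's MIPS rating fixes the ideal execution time; the class fields below mirror the constructor arguments described above, and the concrete values are an assumption chosen to resemble CloudSimExample1's single VM and cloudlet:

```python
# Illustration only: plain Python, not the CloudSim API. The ideal
# execution time of a cloudlet is its length (in million instructions)
# divided by the MIPS capacity it can actually use.

class Vm:
    def __init__(self, vm_id, user_id, mips, num_pes, ram, bw, size):
        self.vm_id, self.user_id = vm_id, user_id
        self.mips, self.num_pes = mips, num_pes
        self.ram, self.bw, self.size = ram, bw, size

class Cloudlet:
    def __init__(self, cloudlet_id, length, pes_number, file_size, output_size):
        self.cloudlet_id, self.length = cloudlet_id, length
        self.pes_number = pes_number
        self.file_size, self.output_size = file_size, output_size

def execution_time(cloudlet, vm):
    # Workload divided by capacity, capped by the PEs the cloudlet can use.
    usable_pes = min(cloudlet.pes_number, vm.num_pes)
    return cloudlet.length / (vm.mips * usable_pes)

# Assumed example values (1000 MIPS VM, 400000 MI cloudlet).
vm = Vm(vm_id=0, user_id=0, mips=1000, num_pes=1, ram=512, bw=1000, size=10000)
cl = Cloudlet(cloudlet_id=0, length=400000, pes_number=1, file_size=300, output_size=300)
print(execution_time(cl, vm))  # 400.0
```

This is why the simulated cloudlet in the example finishes after a few hundred simulated seconds: the time is simply workload over capacity.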
CloudSimExample1 finished!
RESULT:
Thus the procedure to simulate a cloud scenario using CloudSim is done successfully.
Ex.No: 3b
FILE TRANSFER FROM ONE VM TO ANOTHER VM
Date:
AIM:
To find the procedure to transfer the files from one virtual machine to another virtual
machine.
PROCEDURE:
Step1: Select the VM and click File->Export Appliance
Step7: Go to File->Computer:/home/sam/Documents/
RESULT:
Thus, the procedure to transfer the files from one virtual machine to another virtual machine
was executed successfully.
MODULE - IV
Ex. No: 4a
LAUNCHING VIRTUAL MACHINE USING TRY STACK
Date:
AIM:
To find a procedure to launch a virtual machine using TryStack (online OpenStack).
PROCEDURE:
click Create.
Network Creation:
Instances Launch:
Network Topology:
Create a Router:
External Gateway:
Internal Interface:
Add Interface:
RESULT:
Thus, the procedure to launch virtual machine using trystack – OnlineOpenstack is
completed successfully.
Ex. No: 4b
DEVELOPING WEB APPLICATIONS IN CLOUD.
Date:
AIM:
To develop a web application in the cloud.
PROCEDURE:
When you start the Globus Toolkit container, a number of services start up. The service for this task will be a simple Math service that can perform basic arithmetic for a client.
It is possible to start with this interface and create the necessary WSDL file using the standard Web service tool called Java2WSDL. However, the WSDL file for GT4 has to include details of resource properties that are not given explicitly in the interface above. Hence, we will provide the WSDL file.
All the required files are provided and come directly from [1]. The MathService source code files can be found at http://www.gt4book.com
(http://www.gt4book.com/downloads/gt4book-examples.tar.gz)
A Windows zip compressed version can be found at
http://www.cs.uncc.edu/~abw/ITCS4146S07/gt4book-examples.zip. Download and
uncompress the file into a directory called GT4services. Everything is included (the Java source, WSDL, and deployment files, etc.):
WSDL service interface description file -- The WSDL service interface description file is provided within the GT4services folder at:
GT4services\schema\examples\MathService_instance\Math.wsdl
This file, and discussion of its contents, can be found in Appendix A. Later on we will need to
modify this file, but first we will use the existing contents that describe the Math service above.
Service code in Java -- For this assignment, both the code for service operations and for the
resource properties are put in the same class for convenience. More complex services and
resources would be defined in separate classes. The Java code for the service and its resource
properties is located within the GT4services folder at:
GT4services\org\globus\examples\services\core\first\impl\MathService.java.
Deployment Descriptor -- The deployment descriptor gives several different important sets of
information about the service once it is deployed. It is located within the GT4services folder
at:
GT4services\org\globus\examples\services\core\first\deploy-server.wsdd.
globus-deploy-gar org_globus_examples_services_core_first.gar
Successful output of the command is:
{
    public static void main(String[] args)
    {
        try
        {
            String serviceURI = args[0];
            // Create endpoint reference to service
            EndpointReferenceType endpoint = new EndpointReferenceType();
            endpoint.setAddress(new Address(serviceURI));
            MathPortType math;
            // Get PortType
            math = locator.getMathPortTypePort(endpoint);
            // Perform an addition
            math.add(10);
            // Perform another addition
            math.add(5);
            // Access value
            System.out.println("Current value: " + math.getValueRP(new GetValueRP()));
            // Perform a subtraction
            math.subtract(5);
            // Access value
            System.out.println("Current value: " + math.getValueRP(new GetValueRP()));
        } catch (Exception e)
        {
            e.printStackTrace();
        }
    }
}
When the client is run from the command line, you pass it one argument: the URL that specifies where the service resides. The client will create the endpoint reference and incorporate this URL as the address. The endpoint reference is then used with the getMathPortTypePort method of a MathServiceAddressingLocator object to obtain a reference to the Math interface (portType). Then, we can apply the methods available in the service as though they were local methods. Notice that the calls to the service (the add and subtract method calls) must be in a “try {} catch(){}” block because a “RemoteException” may be thrown. The code for the “MathServiceAddressingLocator” is created during the build process. (Thus you don’t have to write it!)
org.globus.examples.clients.MathService_instance.Client
http://localhost:8080/wsrf/services/examples/core/first/MathService
which should give the output:
Current value: 15
Current value: 10
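The arithmetic behind those two printed lines can be traced with a plain-Python stand-in for the stateful Math resource (a sketch, not Globus code; only the operation names mirror the service):

```python
# Sketch (not Globus code): the Math resource keeps one integer value;
# add and subtract mutate it, and getValueRP reads it back.

class MathResource:
    def __init__(self):
        self.value = 0

    def add(self, n):
        self.value += n

    def subtract(self, n):
        self.value -= n

    def get_value_rp(self):
        return self.value

svc = MathResource()
svc.add(10)                                  # math.add(10)
svc.add(5)                                   # math.add(5)
print("Current value:", svc.get_value_rp())  # Current value: 15
svc.subtract(5)                              # math.subtract(5)
print("Current value:", svc.get_value_rp())  # Current value: 10
```

Because the resource is stateful, the second call reads the value left behind by the earlier additions, which is exactly why the client prints 15 and then 10.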
globus-undeploy-gar org_globus_examples_services_core_first
which should result with the following output:
Undeploying gar...Deleting /.
.
.
Undeploy successful
6 Adding Functionality to the Math Service
In this final task, you are asked to modify the Math service and associated files so that the service supports the multiplication operation. To do this task, you will need to modify several files. The exact changes that are necessary are not given; you are to work them out yourself. You will need to fully understand the contents of the service code and WSDL files and then modify them accordingly. Appendix A gives an explanation of the important parts of these files. Keep all file names the same and simply redeploy the service afterwards. You will also need to add code to the client (Client.java) to test the modified service, including the multiplication operation.
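Before editing the WSDL and service code, the intended behaviour of the new operation can be prototyped in isolation; this sketch only mirrors the operation names, and none of it is Globus or WSDL code:

```python
# Sketch of the extended stateful resource with the multiplication
# operation the exercise asks for; names mirror the service operations
# only, not any Globus API.

class ExtendedMathResource:
    def __init__(self):
        self.value = 0

    def add(self, n):
        self.value += n

    def subtract(self, n):
        self.value -= n

    def multiply(self, n):
        self.value *= n

svc = ExtendedMathResource()
svc.add(10)
svc.multiply(3)
print("Current value:", svc.value)  # Current value: 30
```

A matching test call in Client.java would invoke the new multiply operation between an add and a getValueRP read, and the printed value should change accordingly.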
RESULT:
Thus, a new Web application in cloud was developed successfully.
MODULE - V
Ex. No: 5a
USING API’S OF HADOOP TO INTERACT WITH IT
Date:
AIM:
To write a program that uses the APIs of Hadoop to interact with it and display the content of a file existing in HDFS.
PROCEDURE:
/home/hduser/HadoopFScat.java:
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HadoopFScat
{
    public static void main(String[] args) throws Exception
    {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fileSystem = FileSystem.get(URI.create(uri), conf);
        InputStream inputStream = null;
        try
        {
            inputStream = fileSystem.open(new Path(uri));
            IOUtils.copyBytes(inputStream, System.out, 4096, false);
        }
        finally
        {
            IOUtils.closeStream(inputStream);
        }
    }
}
OUTPUT:
RESULT:
Thus, a program that uses the APIs of Hadoop to display the content of a file existing in HDFS was created and executed successfully.
Ex. No: 5b
WORD COUNT PROGRAM
Date:
AIM:
To write a word count program to demonstrate the use of Map and Reduce task.
PROCEDURE:
Step 1:
cs1-17@cs117-HP-Pro-3330-MT:~$ sudo su user
[sudo] password for cs1-17:
user@cs117-HP-Pro-3330-MT:/home/cs1-17$ cd\
user@cs117-HP-Pro-3330-MT:~$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
16/09/20 10:09:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop1/logs/hadoop-user-namenode-cs117-HP-Pro-3330-MT.out
localhost: starting datanode, logging to /usr/local/hadoop1/logs/hadoop-user-datanode-cs117-HP-Pro-3330-MT.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop1/logs/hadoop-user-secondarynamenode-cs117-HP-Pro-3330-MT.out
16/09/20 10:10:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
Starting resourcemanager, logging to /usr/local/hadoop1/logs/yarn-user-resourcemanager-cs117-HP-Pro-3330-MT.out
localhost: starting nodemanager, logging to /usr/local/hadoop1/logs/yarn-user-nodemanager-cs117-HP-Pro-3330-MT.out
user@cs117-HP-Pro-3330-MT:~$ jps
9551 NodeManager
8924 NameNode
9857 Jps
9076 DataNode
9265 SecondaryNameNode
9420 ResourceManager
Step 2:
Create a directory named ip1 on the desktop. In the ip1 directory, create a two.txt file for word count purposes. Create a directory named op1 on the desktop.
user@cs117-HP-Pro-3330-MT:~$ cd /usr/local/hadoop1
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ bin/hdfs dfs -mkdir /user2
16/09/20 10:46:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ bin/hdfs dfs -put '/home/cs1-17/Desktop/ip1' /user2
16/09/20 10:48:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount /user1/inputdata output1
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ bin/hdfs dfs -put '/home/cs1-17/Desktop/op1' /user2
16/09/20 11:02:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Step 3:
attempt_local1146489696_0001_m_000000_0
16/09/20 11:02:13 INFO mapred.LocalJobRunner: Starting task:
attempt_local1146489696_0001_m_000001_0
16/09/20 11:02:13 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
16/09/20 11:02:13 INFO mapred.MapTask: Processing split:
hdfs://localhost:54310/user2/ip1/one.txt~:0+0
16/09/20 11:02:13 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
16/09/20 11:02:13 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
16/09/20 11:02:13 INFO mapred.MapTask: soft limit at 83886080
16/09/20 11:02:13 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
16/09/20 11:02:13 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
16/09/20 11:02:13 INFO mapred.MapTask: Map output collector class =
org.apache.hadoop.mapred.MapTask$MapOutputBuffer
16/09/20 11:02:13 INFO mapred.LocalJobRunner:
16/09/20 11:02:13 INFO mapred.MapTask: Starting flush of map output
16/09/20 11:02:13 INFO mapred.Task: Task:attempt_local1146489696_0001_m_000001_0 is done. And is in the process of committing
16/09/20 11:02:13 INFO mapred.LocalJobRunner: map
16/09/20 11:02:13 INFO mapred.Task: Task 'attempt_local1146489696_0001_m_000001_0'
done.
16/09/20 11:02:13 INFO mapred.LocalJobRunner: Finishing task:
attempt_local1146489696_0001_m_000001_0
16/09/20 11:02:13 INFO mapred.LocalJobRunner: map task executor complete.
16/09/20 11:02:13 INFO mapred.LocalJobRunner: Waiting for reduce tasks
16/09/20 11:02:13 INFO mapred.LocalJobRunner: Starting task:
attempt_local1146489696_0001_r_000000_0
16/09/20 11:02:13 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
16/09/20 11:02:13 INFO mapred.ReduceTask: Using ShuffleConsumerPlugin:
org.apache.hadoop.mapreduce.task.reduce.Shuffle@b0a9ac0
16/09/20 11:02:13 INFO reduce.MergeManagerImpl: MergerManager:
memoryLimit=333971456, maxSingleShuffleLimit=83492864,
mergeThreshold=220421168, ioSortFactor=10,
memToMemMergeOutputsThreshold=10
Step 4:
Step 5:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ usr/local/hadoop1/bin/hadoop fs -cat
op1/result.txt
bash: usr/local/hadoop1/bin/hadoop: No such file or directory
Step 6:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ usr/local/hadoop1/bin/hadoop fs -cat
op1/*
bash: usr/local/hadoop1/bin/hadoop: No such file or directory
Step 7:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ /usr/local/hadoop1/bin/hadoop fs -cat
op1/*
16/09/20 11:05:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
hello 3
helo 1
world 3
Step 8:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ /usr/local/hadoop1/bin/hadoop fs -cat
op1/result.txt
16/09/20 11:06:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
cat: `op1/result.txt': No such file or directory
Step 9:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ /usr/local/hadoop1/bin/hadoop fs
op1/result.txt
op1/result.txt: Unknown command
Step 10:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ /usr/local/hadoop1/bin/hadoop fs -cat
op1/>>result.txt
16/09/20 11:06:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
cat: `op1': Is a directory
Step 11:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ /usr/local/hadoop1/bin/hadoop fs -cat >> op1/result.txt
bash: op1/result.txt: No such file or directory
Step 12:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ stop-all.sh
This script is Deprecated. Instead use stop-dfs.sh and stop-yarn.sh
16/09/20 11:11:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
Stopping namenodes on [localhost]
localhost: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/09/20 11:12:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
stopping yarn daemons
stopping resourcemanager
localhost: stopping nodemanager
no proxyserver to stop
Step 13:
user@cs117-HP-Pro-3330-MT:/usr/local/hadoop1$ cd\
>
user@cs117-HP-Pro-3330-MT:~$
Wordcount.java
//package org.myorg;
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class WordCount {

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "wordcount");
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);
        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.waitForCompletion(true);
    }
}
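The Map and Reduce classes above can be traced in miniature with plain Python; the sample input lines below are an assumption chosen to reproduce the counts printed in Step 7 (hello 3, helo 1, world 3):

```python
# Plain-Python trace of the same map -> shuffle -> reduce flow.
from collections import defaultdict

def map_phase(line):
    # Mapper: emit a (word, 1) pair for every token, like WordCount.Map.
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    # Shuffle groups pairs by key; the reducer sums each group,
    # like WordCount.Reduce.
    counts = defaultdict(int)
    for word, one in pairs:
        counts[word] += one
    return dict(counts)

# Assumed sample input, chosen to match the counts seen in Step 7.
lines = ["hello world", "hello world hello", "helo world"]
pairs = [pair for line in lines for pair in map_phase(line)]
print(reduce_phase(pairs))  # {'hello': 3, 'world': 3, 'helo': 1}
```

In the real job, the map and reduce phases run on different nodes and the framework performs the grouping between them; the arithmetic, however, is exactly this.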
RESULT:
Thus, a word count program to demonstrate the use of Map and Reduce task is created
and executed successfully.
AIM:
PROCEDURE:
RESOURCE USAGE & QUOTAS
NUMBER OF VMS: 0 / 10
MEMORY: 0M / 0M
CPU: 0.00 / 0.00
VOLATILE_SIZE: 0M / 0M
Prepare Virtual Resources for the Users
At this point, the cloud administrator can also prepare working Templates and Images for the vDC users.
$ onetemplate chgrp ubuntu web-dev
RESULT: