How to Upload Projects on Google Cloud


Google Cloud Storage is a flexible, cost-effective, and powerful storage solution, especially for unstructured objects. This tutorial explains how to store objects and data in Google Cloud Storage quickly and efficiently. Following the simple steps below will let you store data in Cloud Storage with ease:

  • Create a bucket: buckets are the containers that hold any type of file you transfer to Cloud Storage.
  • Upload and share objects: put the created bucket to use by uploading data and objects, and by making them available to the public.

Creating a bucket 

Step one is creating a bucket. Open the navigation menu in the upper-left corner of the console, select Storage, and click Create bucket. You can then specify the bucket's properties; every value may be left at its default except the name. Consider the following:

  • The name you choose must be globally unique. You will get an error if you pick a name that already belongs to an existing bucket in Google Cloud Storage.
  • Default storage class: as the name suggests, this class is assigned to the bucket by default. Your choice depends on how frequently your data and objects will be accessed and on whether you serve users worldwide; each class affects your cost.
  • Location: it is advisable to keep your data close to your users and applications. The options available depend in part on the storage class you selected.
  • Finish by clicking the Create button.
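Because a rejected name is the most common stumbling block, here is a small sketch of a client-side name check. It is a simplified version of the published naming rules (3-63 characters, lowercase letters, digits, dashes, and underscores, starting and ending with a letter or digit); only the service itself can confirm that a name is actually unique, and the example names are hypothetical.

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Rough client-side check of Cloud Storage bucket naming rules.
    Uniqueness can only be verified by the service itself."""
    if not 3 <= len(name) <= 63:
        return False
    # lowercase letters, digits, dashes, underscores; must start/end alphanumeric
    return re.fullmatch(r"[a-z0-9][a-z0-9_-]*[a-z0-9]", name) is not None

print(is_valid_bucket_name("my-unique-bucket-2024"))  # True
print(is_valid_bucket_name("Bad_Bucket"))             # False: uppercase letters
```

A check like this only saves a round trip; the console performs the authoritative validation when you click Create.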

Summary of Creating A Bucket 

In short: create a bucket and give it a unique, permanent name; the name should not contain sensitive information, since it may be shared publicly. Select the default storage class, choose how you want access to your objects controlled, and then click Create.

Uploading Files 

After creating your bucket, you can begin uploading data and objects to Google Cloud Storage. You can upload any type of data into the bucket by following the procedure below.

On the Bucket details page, click Upload files and select the file you want to upload. Alternatively, you can drag files into the space underneath the bucket name.
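Under the hood, both the button and the drag-and-drop gesture end up calling the Cloud Storage JSON API's simple-upload endpoint. As a sketch, this is how that endpoint's URL is put together for a given bucket and object name (the bucket name here is a hypothetical example; object names must be URL-encoded):

```python
from urllib.parse import quote

def media_upload_url(bucket: str, object_name: str) -> str:
    """Build the JSON API 'simple upload' endpoint for a bucket/object pair.
    The object name is percent-encoded, including any slashes."""
    return (
        "https://storage.googleapis.com/upload/storage/v1/b/"
        f"{bucket}/o?uploadType=media&name={quote(object_name, safe='')}"
    )

print(media_upload_url("my-unique-bucket", "album/bmw.png"))
```

You never need to build this URL yourself in the console workflow; it is shown only to make clear what "uploading an object" means at the API level.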

To Share an Object Publicly 

After uploading your file or project, you may want to share it. You can do this by making the object's URL accessible to the public.

For individual objects, proceed as follows:

  1. In the row for the chosen file, open the drop-down menu behind the three vertical dots at the far right.
  2. Click Edit permissions.
  3. In the overlay that appears, click the + Add item button.
  4. In the new row, choose User in the Entity column, enter allUsers in the Name column, and select Reader in the Access column.
  5. Finish by clicking Save.
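The row you add in step 4 corresponds to a single entry in the object's access control list as the JSON API represents it: entity allUsers with role READER. A minimal sketch, using plain dictionaries rather than a client library:

```python
PUBLIC_READ = {"entity": "allUsers", "role": "READER"}

def add_public_read(acl: list) -> list:
    """Append the public-read ACL entry unless it is already present."""
    if PUBLIC_READ not in acl:
        acl.append(dict(PUBLIC_READ))
    return acl

# A typical object ACL starts with an owner entry; the address is hypothetical.
acl = [{"entity": "user-owner@example.com", "role": "OWNER"}]
print(add_public_read(acl))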

Groups of Objects 

  • Open the Cloud Storage browser.
  • Click the bucket you want to share publicly.
  • Click the Permissions tab near the top of the page.
  • Click Add members to open the Add members dialog.
  • Enter allUsers in the New members field.
  • In the Roles drop-down, select the Storage sub-menu and click Storage Object Viewer.
  • Finish by clicking Save.
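At the bucket level, the dialog edits the bucket's IAM policy: it adds the member allUsers to the role roles/storage.objectViewer. A sketch of that edit on a policy represented as a plain dictionary (the existing binding shown is a hypothetical example):

```python
def grant_all_users_object_viewer(policy: dict) -> dict:
    """Add the binding the Add members dialog creates:
    member 'allUsers' with role 'roles/storage.objectViewer'."""
    bindings = policy.setdefault("bindings", [])
    for b in bindings:
        if b["role"] == "roles/storage.objectViewer":
            if "allUsers" not in b["members"]:
                b["members"].append("allUsers")
            return policy
    bindings.append({"role": "roles/storage.objectViewer",
                     "members": ["allUsers"]})
    return policy

policy = {"bindings": [{"role": "roles/storage.admin",
                        "members": ["user:owner@example.com"]}]}
print(grant_all_users_object_viewer(policy))
```

Note the difference from the per-object steps: this grants read access to every object in the bucket, present and future.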

After you click Save, a link icon appears in the public access column for the selected objects. Clicking the icon gives you the public URL for the object.
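That URL follows a predictable pattern, so you can also construct it yourself. A sketch, with a hypothetical bucket name; the object name is percent-encoded but slashes are kept, since they act as path separators:

```python
from urllib.parse import quote

def public_url(bucket: str, object_name: str) -> str:
    """Public objects are served at this well-known address, which is
    what the link icon in the public access column resolves to."""
    return f"https://storage.googleapis.com/{bucket}/{quote(object_name)}"

print(public_url("my-unique-bucket", "bmw.png"))
# https://storage.googleapis.com/my-unique-bucket/bmw.png
```

The URL only works, of course, once the object or its bucket has actually been made public as described above.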

Delete Objects 

Return to the bucket level by clicking the Buckets link, select the bucket, tick the checkbox next to the folder or object, click Delete, and click OK to confirm.

Cloud Storage with the gsutil Command-Line Tool

As long as you have the Google Cloud SDK on your remote server or workstation, you can easily copy files to and from Cloud Storage buckets, so you can work either remotely or locally.

First install the Cloud SDK, which runs on Windows, macOS, and Linux. It requires Python 2.7.9 or above, and some tools, such as Java for Google App Engine, have additional requirements.

Cloud SDK Installation 

You can install the Cloud SDK in any of the following ways:

  • non-interactively, for scripts or continuous-integration setups,
  • from the packages for Ubuntu/Debian,
  • as a Docker image,
  • on macOS and Windows, with the interactive installer.

Each of these methods gives you the default Cloud SDK components, including the bq, gsutil, and gcloud command-line tools.

After installing the Cloud SDK, you can continue with the upload using the gsutil cp command:

gsutil cp [PROJECT_LOCATION] gs://[DESTINATION_BUCKET_NAME]/

where:

  • [PROJECT_LOCATION] is the local path to your project file, for example Desktop/picture album/bmw.png;
  • [DESTINATION_BUCKET_NAME] is the unique name you gave the bucket to which you are uploading your files.
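As a sketch of how the two placeholders fit together, here is a helper that assembles the command as a string; shlex.quote protects local paths that contain spaces, like the example above. The bucket name is a hypothetical one:

```python
import shlex

def gsutil_cp_command(local_path: str, bucket: str) -> str:
    """Assemble the gsutil upload command; shlex.quote protects
    paths that contain spaces or other shell metacharacters."""
    return f"gsutil cp {shlex.quote(local_path)} gs://{bucket}/"

print(gsutil_cp_command("Desktop/picture album/bmw.png", "my-unique-bucket"))
# gsutil cp 'Desktop/picture album/bmw.png' gs://my-unique-bucket/
```

In practice you would simply type the command into your shell; the point is that an unquoted path with a space would be split into two arguments and fail.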

When the system confirms your request as successful, it responds with a summary line, reporting one entry per uploaded object along with the total size, for example:

Operation completed over 1 objects/68.8 KiB.

Uploading with Google Scripts 

This method lets you write a Google Apps Script that uploads files automatically according to a set of commands, or you can upload data manually as well.

Open the Google Developers Console and enable the JSON Data API, Google Cloud Storage, and the Google Cloud Storage API. To use Google Cloud Storage, you also need to enable billing. Then generate the API keys and add the redirect URL, which contains your Google Apps Script project key.

Migrating an already established business operation to GCP will usually require transferring large files to Cloud Storage. Cloud Storage is a durable and highly available object storage platform with no limit on the number of objects stored; however, each individual object is limited to 5 TB. So, before you transfer, know how much data is to be transferred, where that data is located, and how frequently transfers will occur.
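Taking the 5 TB figure above at face value, a quick planning check might look like this sketch, which flags any planned upload that would have to be split across several objects:

```python
TERABYTE = 1000 ** 4              # decimal units, matching the 5 TB figure above
MAX_OBJECT_SIZE = 5 * TERABYTE    # per-object cap

def fits_in_one_object(size_bytes: int) -> bool:
    """Check a planned upload against the 5 TB per-object limit;
    larger datasets must be split across several objects."""
    return size_bytes <= MAX_OBJECT_SIZE

print(fits_in_one_object(500 * 1000 ** 3))  # a 500 GB archive fits
print(fits_in_one_object(6 * TERABYTE))     # a 6 TB archive does not
```

Running a check like this over an inventory of your files answers the "how much data, and in what pieces" question before the migration starts.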

Author Bio

Juan Koss – I am a graduate of the University of California, Los Angeles. I specialize in PhD writing, math, and IT science at the Write My Essay company. I am proficient in paper writing and would be happy to help in any way I can!


