My First WordCamp Kerala: More Than Just a Tech Conference

On November 9, 2024, I had the opportunity to attend my first WordCamp Kerala, a WordPress conference that was nothing short of extraordinary. It’s funny to think that, despite using WordPress for the past five years, this was my first time immersing myself in the vibrant WordPress community in person. The excitement was palpable, and the day was filled with insights, connections, and newfound inspiration.

The ticket for the event was reserved by Manoj Karingamadathil, a digital humanitarian and open-access activist whose multifaceted work greatly inspired me. Manoj’s dedication to biodiversity conservation, the preservation of classical Malayalam literature, and his advocacy for free and open-source software deeply align with the values I cherish.

Manoj’s connection to my world runs deeper than just our shared interests. He is the life partner of Jameela P., a Library and Information Assistant at the Indian Naval Academy and someone I’ve exchanged ideas with over the years. Even though I haven’t had the chance to meet Jameela in person yet, I feel a sense of kinship with her and look forward to future programs where we might finally connect face-to-face.

Another remarkable individual I met was Kunjappy. The moment I saw him, I was reminded of my sadu (my sister’s son), with his boundless energy and a spirit that seemed as unstoppable as a revving engine ready to conquer new horizons. It’s amazing how meeting people like Manoj and Kunjappy reminds me of the importance of simplicity and shared vibes.

Attending WordCamp Kerala wasn’t just about learning the latest in WordPress development and community engagement. It became a moment of realization: as a library professional, my contributions may be meaningful, but there is also so much more to be done as a citizen of this world. Being surrounded by people who actively work toward a better society has reignited a spark in me to think beyond the boundaries of my profession.

Reflecting on this experience, I feel incredibly grateful. The WordCamp Kerala event wasn’t just a gathering of WordPress enthusiasts but a confluence of individuals who are passionate about making a difference.

Snapshots from #WordCampKerala2024

Automating WordPress Backups with a Shell Script

In the digital age, data is invaluable, especially for websites that rely on content management systems like WordPress. Regular backups are essential to safeguard your website against data loss due to various reasons, including server failures, malware attacks, or accidental deletions. In this post, we’ll walk through a shell script designed to automate the backup process for a WordPress site, ensuring you have secure copies of your database and files.

Why Backup Your WordPress Site?

Backups serve as your safety net. They allow you to quickly restore your site to a previous state in case of issues. Here are some key reasons to regularly back up your WordPress site:

  • Data Loss Prevention: Accidental changes or deletions can happen. A backup allows you to revert to a working version.
  • Malware Recovery: If your site gets compromised, having a clean backup is crucial for recovery.
  • Server Failures: Hardware or software failures can lead to data loss. Backups ensure you can restore your site.

The Backup Script

Here’s a breakdown of a simple yet effective backup script for a WordPress site:

#!/bin/bash

# 1: Define backup filenames with timestamps.
db_backup_name="wp-db-backup-$(date +%d-%m-%Y-%H.%M).sql.gz"
wpfiles_backup_name="wp-content-backup-$(date +%d-%m-%Y-%H.%M).tar.gz"

# 2: Database connection info.
db_name="wordpress"
db_username="wordpress"
db_password="wordpress123"

# 3: Path to WordPress root directory and wp-content.
wp_root_folder="/var/www/html/wordpress"
backup_folder_path="/home/mahesh/backups/wordpress"

# 4: Backup MySQL database.
mysqldump --opt -u"$db_username" -p"$db_password" "$db_name" | gzip > "$backup_folder_path/$db_backup_name"

# 5: Navigate to WordPress root and create a tarball of wp-content only.
cd "$wp_root_folder" || exit
tar -czf "$backup_folder_path/$wpfiles_backup_name" wp-content

# 6: Delete all but 3 most recent database backups.
find "$backup_folder_path" -maxdepth 1 -name "*.sql.gz" -type f -printf "%T@ %p\n" | sort -rn | awk 'NR>3 {print $2}' | xargs -r rm --

# 7: Delete all but 3 most recent wp-content backups.
find "$backup_folder_path" -maxdepth 1 -name "*.tar.gz" -type f -printf "%T@ %p\n" | sort -rn | awk 'NR>3 {print $2}' | xargs -r rm --

Download the Script: gist.githubusercontent.com/maheshpalamuttath/482f1e43bc170d822fc4b19f368cd655/raw/8e2e96e40b2159aece20b6a0143f9383723b8317/wp-backup.sh

Breakdown of the Script

1. Defining Backup Filenames
The script starts by defining the names for the backups with a timestamp, ensuring that each backup is uniquely named. This prevents any accidental overwriting of existing backups.

db_backup_name="wp-db-backup-$(date +%d-%m-%Y-%H.%M).sql.gz"
wpfiles_backup_name="wp-content-backup-$(date +%d-%m-%Y-%H.%M).tar.gz"

2. Database Connection Information
Next, the script defines the database connection information, including the database name, username, and password. This allows the script to connect to the MySQL database to create a backup.

db_name="wordpress"
db_username="wordpress"
db_password="wordpress123"

Security Note: It is advisable to avoid hardcoding sensitive information directly into scripts. Consider using environment variables or a configuration file that has restricted access.
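
As a minimal sketch of that approach (the file name and path here are assumptions; adjust them to your setup), the credentials can live in a root-only file that the script sources instead of hardcoding them:

# One-time setup: create a credentials file readable only by root.
cat > /root/.wp-backup.env <<'EOF'
db_name="wordpress"
db_username="wordpress"
db_password="wordpress123"
EOF
chmod 600 /root/.wp-backup.env

# In the backup script, replace the hardcoded block above with:
# source /root/.wp-backup.env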

3. Specifying Directories
The paths to the WordPress root directory and the backup location are specified. This ensures that the script knows where to look for the files to back up and where to store the backup files.

wp_root_folder="/var/www/html/wordpress"
backup_folder_path="/home/mahesh/backups/wordpress"

4. Backing Up the MySQL Database
The mysqldump command creates a backup of the WordPress database, compressing it with gzip to save space. The output is redirected to a file named based on the current timestamp.

mysqldump --opt -u"$db_username" -p"$db_password" "$db_name" | gzip > "$backup_folder_path/$db_backup_name"

5. Creating a Tarball of wp-content
The script changes the directory to the WordPress root and creates a tarball of the wp-content directory, which contains all your themes, plugins, and uploaded files.

cd "$wp_root_folder" || exit
tar -czf "$backup_folder_path/$wpfiles_backup_name" wp-content

6. Cleaning Up Old Backups
To manage disk space, the script includes commands to delete old backups, keeping only the three most recent database backups and wp-content backups.

find "$backup_folder_path" -maxdepth 1 -name "*.sql.gz" -type f -printf "%T@ %p\n" | sort -rn | awk 'NR>3 {print $2}' | xargs -r rm --
find "$backup_folder_path" -maxdepth 1 -name "*.tar.gz" -type f -printf "%T@ %p\n" | sort -rn | awk 'NR>3 {print $2}' | xargs -r rm --

To set up this backup script on your server to run at regular intervals, you can use cron, the default job scheduler on Linux. Here’s a step-by-step guide:

Save your backup script with a .sh extension in a location that you can access. For example:

sudo su
vim /usr/local/bin/wp_backup.sh

Copy the script into this file, save it, and exit.

Give the script execute permissions so it can run:

chmod +x /usr/local/bin/wp_backup.sh

Run the script manually to ensure it works as expected:

/usr/local/bin/wp_backup.sh

If there are any errors, they should show up now, and you can troubleshoot as needed.

Use cron to automate the backup process. To edit the cron jobs:

crontab -e

Add a new line at the end of the file to specify when you’d like the backup to run. For example, to run the backup script every day at 2:00 AM, add this line:

0 2 * * * /usr/local/bin/wp_backup.sh >> /home/mahesh/logs/wp_backup.log 2>&1

This command means:

0 2 * * * – Run at 2:00 AM every day.
/usr/local/bin/wp_backup.sh – Path to your backup script.
>> /home/mahesh/logs/wp_backup.log 2>&1 – Save output and errors to a log file for troubleshooting.

If you’re using a log file, create the directory for it first (optional):

mkdir -p /home/mahesh/logs

After adding the cron job, you can verify it with:

crontab -l

Your backup script is now set to run automatically on your server at the specified interval!

Use Rclone to sync your backup files to Google Drive: https://libtechnophile.blogspot.com/2020/07/using-google-drive-on-debianubuntu-with.html
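
For instance, assuming you have already configured an rclone remote named gdrive (the remote name and target folder are just placeholders; use whatever you set up in rclone config), a line like the following at the end of the backup script, or as a separate cron job, copies the latest backups to Google Drive:

# Copy backups created in the last 24 hours to the "gdrive" remote (assumed name).
rclone copy /home/mahesh/backups/wordpress gdrive:wordpress-backups --max-age 24h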

Automating your WordPress backups with a shell script is a practical solution to ensure your data is safe and easily restorable. By regularly running this script, you can avoid the headaches of data loss and keep your site secure. Remember to periodically check your backup files and test the restoration process to ensure everything works as expected.
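
As a quick sketch of what such a restore test might look like (the file names below are illustrative; substitute an actual backup from your backup folder):

# Restore the database dump into MySQL (you will be prompted for the password).
gunzip < /home/mahesh/backups/wordpress/wp-db-backup-09-11-2024-02.00.sql.gz | mysql -u wordpress -p wordpress

# Restore wp-content into the WordPress root directory.
tar -xzf /home/mahesh/backups/wordpress/wp-content-backup-09-11-2024-02.00.tar.gz -C /var/www/html/wordpress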

 

Why Debian is My Top Pick for a Linux Environment

I choose Debian for my Linux environments because it is known for being stable and reliable. When I set up servers or systems, I want to be sure they won’t have unexpected problems. Debian has a “Stable” release that goes through thorough testing, which gives me confidence that my systems will work smoothly.

Security is another important reason I use Debian. The Debian security team actively monitors and fixes vulnerabilities, so my systems stay safe without major interruptions. This proactive approach is reassuring, especially when handling sensitive data in environments like libraries.

The strong community around Debian is also a big plus. Since Debian has been around since 1993, it has built a supportive community where I can find help and resources easily. Many other Linux distributions, like Ubuntu, are based on Debian, so there is a wealth of shared knowledge available.

Debian offers a huge selection of software packages—over 59,000! This means I can find almost any tool I need without looking for external sources. It also runs efficiently, even on older computers, allowing me to make the most of my hardware.

Managing software is straightforward with Debian’s package management system called APT (Advanced Package Tool). With simple commands, I can install, update, or remove software, and APT takes care of any dependencies automatically. This saves me a lot of time and hassle.
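
For example, a typical APT session on Debian looks like this (the package name is just for illustration):

sudo apt update          # refresh the package lists
sudo apt install vlc     # install a package; dependencies are pulled in automatically
sudo apt upgrade         # apply available updates
sudo apt remove vlc      # remove a package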

For projects that need to last a long time, Debian provides long-term support (LTS). This means I can receive important security updates for older versions of Debian, allowing me to keep my systems running without frequent upgrades.

Lastly, Debian is committed to open-source principles. This ensures that the development process is transparent and ethical, aligning with my values as a technology enthusiast.

In summary, Debian’s stability, security, supportive community, vast software options, and commitment to open-source make it an excellent choice for my Linux environments. If you’re new to Linux and looking for a dependable distribution, Debian is definitely worth considering. It’s a reliable option that allows you to focus on your projects without unnecessary worries.

The Open-Source Software I Self-Host for My Workflow

Open-source software has become a cornerstone of my work as a library technologist. It offers flexibility, community support, and the freedom to customize tools for specific needs—all without the hefty price tags of proprietary alternatives. In this blog post, I’ll share some of the key open-source software solutions I rely on daily, spanning areas like website management, remote desktop access, automation, invoicing, and more.

1. WordPress: Powering My Website

WordPress, one of the world’s most popular content management systems (CMS), is my go-to for managing my personal website. Its vast array of plugins, ease of customization, and active community support make it a versatile tool. Whether it’s customizing themes, integrating SEO features, or managing content, WordPress provides the flexibility I need.

2. InvoicePlane: Simplifying Invoicing and Billing

For invoicing, I use InvoicePlane, an open-source application that allows me to create, track, and manage invoices efficiently. It offers all the essential features I need for handling client information, generating quotes, and managing payments. With customizable invoice templates, it also ensures my documents look professional, and being self-hosted gives me full control over my financial data.

3. n8n: Automating My Workflow

n8n is a workflow automation tool that connects various services, enabling me to automate repetitive tasks. From syncing data between Google Sheets and my library system to sending notifications via WhatsApp, it streamlines workflows across the different platforms I use. With its visual interface, n8n makes automating tasks simple and intuitive, even for non-developers.

4. RustDesk: My Go-To for Remote Desktop Access

For remote desktop access, I rely on RustDesk, an open-source alternative to tools like AnyDesk or TeamViewer. What sets it apart is the option to self-host the server, which gives me full control over my data and privacy. It allows me to remotely manage systems securely and efficiently, offering cross-platform support, including mobile devices.

5. Nginx Proxy Manager: Simplifying Reverse Proxy Setup

Managing my web applications is made easier with Nginx Proxy Manager. It simplifies the setup of reverse proxy servers, handling SSL certificates, redirects, and multiple applications under one domain. This tool ensures secure and flexible management of web services with a user-friendly interface, making it an excellent solution for streamlining access to various tools I run.

6. Uptime Kuma: Monitoring My Services

To ensure that my web services and servers are always up and running, I use Uptime Kuma. This open-source monitoring tool provides real-time tracking of service uptime, sending notifications if anything goes offline. It’s lightweight, self-hosted, and has a modern interface, making it a convenient way to monitor multiple services from one dashboard.
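
As a rough sketch of how little setup such a tool needs (assuming the image name and port published in the project’s documentation), Uptime Kuma can be started with a single Docker command:

# Run Uptime Kuma in the background, persisting its data in a named volume.
docker run -d --restart=always -p 3001:3001 -v uptime-kuma:/app/data --name uptime-kuma louislam/uptime-kuma:1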

7. Vaultwarden: Managing My Passwords Securely

For password management, I use Vaultwarden, a self-hosted alternative to Bitwarden. It ensures that all my passwords are securely encrypted, while allowing me to retain full control over my data. With support for browser extensions and mobile apps, Vaultwarden integrates seamlessly across my devices, making password management both secure and convenient.

8. Stirling PDF: A Reliable PDF Toolkit

Stirling PDF is my go-to tool for managing PDFs. It supports tasks like editing, merging, converting, and securing PDF documents, which is crucial for handling student records, research papers, and invoices. As an open-source, offline tool, it offers a reliable and private solution for managing documents in a professional setting.

9. Homepage: A Custom Dashboard for My Digital Workspace

Homepage acts as a personal dashboard where I organize links, tools, and widgets in one place. It serves as a centralized hub for quick access to essential services like my blog admin panel, n8n automations, and Koha library management system. With its clean design and ease of customization, it enhances productivity by putting everything I need within easy reach.

Why Open Source Matters to Me

The common thread across all these tools is their open-source nature. Open-source software fosters innovation, offers flexibility, and ensures that users remain in control of their data and systems. I love the freedom it gives me to tweak and customize software to meet my specific needs. Moreover, the support and collaboration of the open-source community make it a vibrant space where new ideas flourish.

As I continue exploring and experimenting with new tools, open-source software remains at the core of my workflow. These tools are helping me stay efficient, maintain control over my digital ecosystem, and provide better services to my community. If you’re not already embracing open-source solutions, I highly recommend giving them a try—you might just find they become indispensable in your own workflow!

Support the Open-Source Community

Open-source projects thrive on the contributions and support from their communities. If you find any of these tools as useful as I do, I encourage you to consider contributing financially to the developers behind these projects. Even small donations help sustain development, keep servers running, and fund new features. You can find donation links or ways to support these projects on their respective websites.

Getting Started with Docker: A Beginner’s Guide

In today’s fast-paced development world, the ability to deploy applications quickly and consistently across various environments is crucial. Enter Docker—a powerful platform that allows developers to automate the deployment of applications inside lightweight, portable containers. This post will introduce you to the basics of Docker, its editions, and essential commands to get you started.

What is Docker?

Docker is an open-source platform that enables developers to automate the deployment of applications within containers. Containers are lightweight, portable units that package an application and its dependencies, ensuring that it runs consistently across different computing environments. Unlike virtual machines, containers share the host operating system’s kernel, making them faster and more efficient.

Key Benefits of Using Docker

  • Portability: Docker containers can run on any machine that has Docker installed, making it easy to move applications between environments.
  • Isolation: Each container runs in its own isolated environment, preventing conflicts between applications and their dependencies.
  • Scalability: Docker makes it simple to scale applications up or down by adding or removing containers as needed.
  • Efficiency: Containers use fewer resources than traditional virtual machines, allowing for better utilization of system resources.

Docker Editions: CE and EE

Docker offers two main editions: Docker Community Edition (CE) and Docker Enterprise Edition (EE). Each serves different user needs and environments.

Docker Community Edition (CE)

Docker CE is the free, open-source version of Docker. It’s primarily designed for individual developers and small teams who want to build, ship, and run applications in containers. Key features of Docker CE include:

  • Open Source: Being community-driven, it has a vast ecosystem and support from developers worldwide.
  • Frequent Updates: Regularly updated with new features, bug fixes, and improvements.
  • Ideal for Development: Great for prototyping and testing applications locally before deploying them.

Docker Enterprise Edition (EE)

Docker EE is a commercial version designed for organizations that require a more robust and secure container management solution. It includes advanced features such as:

  • Enterprise-Grade Security: Enhanced security features, including role-based access control and security scanning for images.
  • Management Tools: Integrated management tools for orchestrating container deployments at scale.
  • Support and SLAs: Access to Docker’s professional support, including Service Level Agreements (SLAs) for uptime and performance.
  • Integration with Existing Infrastructure: Seamless integration with various enterprise environments, including private clouds and on-premises servers.

Docker EE is ideal for large organizations and production environments that prioritize security, scalability, and management capabilities.

Installing Docker

To start using Docker, you need to install it on your machine. You can download Docker Desktop for Windows and macOS from the Docker website. For Linux users, you can follow the installation instructions specific to your distribution.

For Debian: gist.githubusercontent.com/maheshpalamuttath/efcd866d698f3ada694204a4e72311d3/raw/e4070a06b8f3e5438fb6fd00e4369cf5c9001c1d/install_docker_on_debian.sh

For Ubuntu: gist.githubusercontent.com/maheshpalamuttath/eee1df5b90ff35733b8229c18b3dcf0e/raw/0d51cb6da0f0c4d86566b8bbbe687a43f16f44d4/install_docker_on_ubuntu.sh
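
If you just want to try Docker quickly on a test machine, Docker also provides a convenience script that detects your distribution and installs the latest release (handy for experimentation, though the repository-based setup above is preferable for production):

# Download and run Docker's official convenience install script.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh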

Basic Docker Commands

Once you have Docker installed, you can start using it with some basic commands. Here’s a list of essential Docker commands to get you started:

1. Check Docker Installation

To verify that Docker is installed correctly, run:

docker --version

This command displays the installed version of Docker.

2. Run a Docker Container

To run a simple container, you can use the following command:

docker run hello-world

This command pulls the hello-world image from Docker Hub and runs it in a container. If everything is set up correctly, you’ll see a success message.

3. List Docker Containers

To view the containers currently running on your system, use:

docker ps

To see all containers, including those that are stopped, add the -a flag:

docker ps -a

4. Stop a Running Container

To stop a running container, you need its container ID or name:

docker stop <container_id_or_name>

5. Remove a Container

To remove a stopped container, use:

docker rm <container_id_or_name>

6. List Docker Images

To see the images available on your system, use:

docker images

7. Remove an Image

To remove an image, you can use:

docker rmi <image_id_or_name>

8. Pull an Image from Docker Hub

To download an image from Docker Hub, use:

docker pull <image_name>

For example, to pull the latest version of the nginx image:

docker pull nginx

9. Build an Image

To build a Docker image from a Dockerfile, navigate to the directory containing the Dockerfile and run:

docker build -t <image_name> .

The -t flag tags the image with a name, and the trailing dot tells Docker to use the current directory as the build context.
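
As a small end-to-end sketch (the image name, file contents, and port mapping below are purely illustrative), here is how you could build and run a tiny static site served by nginx:

# Create a minimal project containing a Dockerfile based on the official nginx image.
mkdir -p ~/docker-demo && cd ~/docker-demo
echo "<h1>Hello from Docker</h1>" > index.html
cat > Dockerfile <<'EOF'
FROM nginx:alpine
COPY index.html /usr/share/nginx/html/index.html
EOF

# Build the image, run it in the background, and test it.
docker build -t my-static-site .
docker run -d -p 8080:80 --name demo my-static-site
curl http://localhost:8080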

10. Run a Detached Container

To run a container in the background (detached mode), use the -d flag:

docker run -d <image_name>

Docker is a powerful tool that simplifies the deployment process for applications by encapsulating them in containers. With its Community and Enterprise editions, it caters to different user needs, from individual developers to large organizations.

Enroll in the 4th Batch of the Diploma in Library Automation!

Are you ready to take the next step in transforming library services with cutting-edge technology? We’re excited to announce the launch of the 4th batch of the Diploma in Library Automation at Rajagiri College of Social Sciences (Autonomous), Kochi, starting from November 3rd, 2024! This program is ideal for library professionals seeking to modernize their skills and enhance their careers in the ever-evolving digital landscape.

About the Course

The Diploma in Library Automation is designed specifically for those in the library field who aspire to integrate technology into their daily operations. This course provides both foundational knowledge and practical experience in essential software and tools that are shaping the future of libraries. Whether you’re an experienced librarian or new to the field, this course will empower you to become a tech-savvy professional capable of adapting to the latest trends in library automation.

What Sets Us Apart

  • Expert Instruction: Learn from seasoned professionals who are experts in the field.
  • Comprehensive Curriculum: Our syllabus includes modules on Linux, Koha ILS, and DSpace IR.
  • Hands-On Learning: Gain practical skills in installation, customization, and maintenance through hands-on exercises.
  • Flexible Schedule: Classes are held on the first and third Sundays of each month, making it easy to balance with your professional commitments.
  • Course Duration: 60 hours over six months, with classes running from 10 AM to 4 PM on scheduled Sundays.

Course Highlights

  • Linux: Explore open-source Linux distributions, system setup, and configuration, focusing on tools that streamline library operations.
  • Koha ILS: Dive into the Koha Integrated Library System, learning about its modules, functionality, and how to maintain an efficient library system.
  • DSpace IR: Discover how to install and manage DSpace, a digital repository platform ideal for organizing and sharing digital collections.
  • Deployment of Web Apps on Cloud Platforms: Learn how to deploy, manage, and scale web applications on cloud platforms, enhancing accessibility and performance for library services.

View Syllabus

Contact for Registration and More Details

For more information on the program and to secure your spot, reach out to Dr. Joshy George (Associate Professor, DLIS & Program Coordinator) at 8606365566. He’ll be happy to guide you through the registration process and answer any questions you may have.

Don’t miss this opportunity to become a leader in the digital transformation of library services. Enroll now to be part of the 4th batch of the Diploma in Library Automation and make a lasting impact on your library and career!

First Experience Growing Microgreens in My Apartment

As someone who loves fresh, healthy food but lives in an apartment, I’ve always been on the lookout for ways to bring some homegrown produce into my urban life. When I discovered microgreens—tiny, nutrient-packed greens—I was immediately intrigued. These miniature plants seemed like the perfect solution for adding a fresh, homegrown touch to my meals. After a bit of research and a lot of excitement, I decided to dive in and give it a try. Here’s a chronicle of my first experience growing microgreens in my cozy apartment.

Getting Started

The first step in my microgreen journey was gathering all the necessary supplies. Here’s what I used:

  • Seeds: I chose mung bean (green gram) seeds to start.
  • Growing Trays: Shallow trays with drainage holes.
  • Tissue Paper: Instead of soil, I opted for tissue paper as a growing medium.
  • Spray Bottle: For watering the seeds.

With my supplies ready, I set up a small growing station in my kitchen, transforming a corner into a mini-indoor garden.

Planting the Seeds

I lined the trays with several layers of tissue paper, ensuring it was evenly spread and could retain moisture. Then, I sprinkled the seeds over the tissue surface, trying to distribute them as evenly as possible. After pressing them gently into the tissue, I misted the surface with water from my spray bottle. This initial setup was simple and satisfying, setting the stage for my microgreens to thrive.

Daily Care

Over the next few days, I kept a close eye on my microgreens. I made sure the tissue paper stayed moist but not waterlogged, misting it with water twice a day.

I was amazed at how quickly the seeds began to sprout! Within just a few days, tiny green shoots started emerging from the tissue paper. Watching them grow a little bit each day was incredibly satisfying and filled me with a sense of accomplishment.

Harvesting

After about two weeks, my microgreens were ready to harvest. The first true leaves had developed, and they looked vibrant and healthy. Using a pair of scissors, I carefully cut the greens just above the tissue paper. The sight of those fresh, homegrown greens was a real delight.

Enjoying the Fruits (Greens) of My Labor

The best part of growing microgreens is, of course, eating them! I added my fresh, homegrown microgreens to salads and used them to prepare stir-fry (thoran).

Lessons Learned

  • Start Small: Begin with a few varieties of seeds to keep things manageable.
  • Consistency is Key: Regular watering and proper lighting are crucial for healthy growth.
  • Experiment and Enjoy: Don’t be afraid to try different seeds and recipes; each type of microgreen brings its own unique flavor and nutritional benefits.

Growing microgreens in my apartment was a rewarding and enjoyable experience. It was easier than I anticipated, and the fresh greens added a delicious, healthy touch to my meals. Whether you’re an urban dweller like me or just looking for a fun, sustainable way to grow your own food, I highly recommend giving microgreens a try. Happy growing!

 

Why Some Linux Users Refuse to Use Snap Package Manager

Snap, a universal packaging system from Canonical, the company behind Ubuntu, has been a topic of debate in the Linux community. While it offers some significant advantages, such as simplified software installation and automatic updates, it has also been met with resistance. Here are some reasons why some Linux users refuse to use Snap.

Proprietary Backend

One of the main criticisms of Snap is that its backend is proprietary and controlled by Canonical. This means that if you want to distribute snaps, you have to set up an account with Canonical and host them there. This has led to concerns about centralization and control, as it goes against the decentralized nature of open-source software development.

Performance Issues

Another common complaint about Snap packages is their performance. Snaps are often slower to install, slower to start, and use more RAM and disk space than traditional packages. This is because Snap packages contain all the dependencies needed for the software to run, which increases their size and can slow down their operation.

Auto-updates

Snap packages auto-update themselves without asking the user. While this can be seen as a benefit in terms of keeping software up-to-date, it can also be a disadvantage. For example, an automatic update could potentially break a working system or use up bandwidth without the user’s knowledge.

System Integration

Snaps do not always respect system settings and can be restricted compared to their non-snap counterparts. This can lead to inconsistencies in the user experience and limit the functionality of the software.

Community Adoption

Community adoption of Snap has been mixed. Some Linux distributions still prefer to use existing packaging formats (e.g., DEB and RPM) and package managers (e.g., APT and DNF) or their alternative, Flatpak. This can lead to fragmentation and confusion among users.

While Snap has its advantages, these issues have led some Linux users to refuse to use it. It’s important to note that these are not inherent flaws in the Snap system, but rather concerns that have been raised by the community. As with any technology, it’s essential to understand its strengths and weaknesses to make an informed decision about whether to use it.


Go Minimal, Go Xubuntu: Unveiling the Leanest Ubuntu Experience Yet! 

Yesterday marked an exciting milestone for Linux enthusiasts as Ubuntu 24.04 LTS, codenamed “Noble Numbat,” made its debut. Among the plethora of editions and flavors available, one particular release stands out for those who prefer a lean, tailored desktop experience: Xubuntu Minimal.

Gone are the days when users had to strip down their Ubuntu installations manually to achieve a lightweight environment. With the official inclusion of Xubuntu Minimal as a subproject, Ubuntu now offers a refined, slimmed-down version that caters to users seeking a desktop with no preinstalled applications.

What is Xubuntu Minimal?

Xubuntu Minimal, formerly known as Xubuntu Core, is a minimal ISO edition that offers the essential features of the Xubuntu desktop environment without any unnecessary bloat. At a compact 2.45 GB, it provides a foundation on which users can build their ideal desktop environment, tailored to their specific preferences and requirements.

Download Xubuntu Minimal

http://ubuntutym2.u-toyama.ac.jp/xubuntu/24.04/release/xubuntu-24.04-minimal-amd64.iso

https://torrent.ubuntu.com/xubuntu/releases/noble/release/minimal/xubuntu-24.04-minimal-amd64.iso.torrent

If you are downloading the Xubuntu Minimal torrent file on a Linux system, run the following:

cd Downloads
sudo apt install transmission-cli
transmission-cli xubuntu-24.04-minimal-amd64.iso.torrent

On Windows, download and install any torrent client, such as qBittorrent.

What should I do after installing Xubuntu Minimal?

sudo apt update && sudo apt upgrade -y
sudo apt install -y thunar-archive-plugin wget vim gpg curl bzip2 firefox
sudo reboot

To learn about the new features in Ubuntu 24.04 LTS, click here.

A Journey Through Library Internship: A Tale of Enthusiastic Learners

As the 15-day library internship programme at Sacred Heart College Library draws to a close, it’s time to reflect on the enriching journey that everyone involved has shared. Two bright interns, Ms. Sathisa and Ms. Aleena, hailing from the Department of Library and Information Science at St. Peter’s College, Kolenchery, embarked on this learning adventure with eagerness and curiosity.

Guiding them through the intricacies of library operations were two librarians, Mr. Biju VV (UGC Librarian) and myself, along with my team from the East Campus Learning Resource Centre. From the outset, it was evident that Sathisa and Aleena were determined to grasp every aspect of library management, from the fundamentals to the nuances of the profession.

The internship was meticulously structured, with the initial seven days spent at the East Campus Learning Resource Centre and the subsequent days at the Central Library. This division allowed them to immerse themselves in different library environments, broadening their understanding of the diverse roles libraries play in academic settings.

During their tenure, they gained exposure to various facets of library management. From book acquisition to meticulous register maintenance, they gained insights into the organisation that underpins effective library functioning. Hands-on experience with library software enhanced their proficiency in tasks such as circulation management and cataloguing in Koha.

Moreover, they delved into the realm of institutional repositories, familiarising themselves with DSpace software for efficient management of digital assets.

Beyond technical skills, we guided them on essential aspects of professional development. They learned the ropes of navigating job vacancies, preparing for interviews, and understanding the responsibilities that come with being a librarian. This holistic approach ensured that they not only acquired technical expertise but also developed the soft skills necessary for success in the field.

As the internship draws to a close, Sathisa and Aleena depart with a wealth of knowledge and experiences that will undoubtedly shape their future endeavours in the field of library and information science.

– Mahesh Palamuttath