Home Tech Setup

Objective: Document the technical environment and setup I use at home, as inspiration and guidance for readers who want to build something similar.
Expectations: I am a Data Scientist by trade, so some aspects of the setup may be incomplete or not fully optimized. Feedback is always welcome, and I plan to keep updating the content based on suggestions and improvements.

Overview

1. Introduction

  • Background for writing this blog post

    • While setting up various technical environments at home, I realized how challenging it can be to locate related resources without proper documentation.

    • I aim to share reference materials and setup tips to help readers build or improve similar environments.

  • Features of the technical setup

    • The setup spans multiple operating systems, and I experiment with various applications on each OS.

  • Expectations from readers

    • I would greatly appreciate feedback and suggestions for improving the technical setup.

2. Hardware Configuration

1) Introduction to Equipment

| Equipment | Model | CPU | GPU | Memory | Storage | OS |
|---|---|---|---|---|---|---|
| Mac mini | Mac mini (M1, 2020) | 4-core Apple Firestorm 3.2 GHz + 4-core Apple Icestorm 2.06 GHz | 8-core Apple G13G 1.28 GHz + NPU: 16-core Neural Engine (4th gen) | 16 GB | SSD: 256 GB + NVMe SSD: 500 GB + HDD: 1 TB | macOS Sequoia 15.1.1 (24B91) |
| Raspberry Pi 3 | Raspberry Pi 3 Model B Rev 1.2 | 1.2 GHz ARM Cortex-A53 | Broadcom VideoCore IV MP2 400 MHz | 1 GB LPDDR2-450 SDRAM | SD: 64 GB | Debian GNU/Linux 12 (bookworm), kernel 6.6.31+rpt-rpi-v8 |
| Raspberry Pi 4 | Raspberry Pi 4 Model B Rev 1.1 | 1.8 GHz ARM Cortex-A72 MP4 (BCM2711 SoC) | Broadcom VideoCore VI MP2 500 MHz (BCM2711 SoC) | 4 GB LPDDR4-3200 SDRAM | SD: 64 GB | Ubuntu 20.04.3 LTS |
| ODROID | ODROID-HC4 | Quad-core Cortex-A55 | ARM Mali-G31 MP2 | 16 GB | SD: 32 GB + HDD: 6 TB | Ubuntu 20.04.6 LTS, kernel 4.9.337-92 |
| Windows notebook | HP Pavilion 14-al150tx | Intel i7-7500U 2.7 GHz (2 cores) | NVIDIA GeForce 940MX | 4 GiB DDR4 | SSD: 256 GB + HDD: 1 TB | Windows 10 |

2) Hardware Layout and Connections

  • Router: Xiaomi AX1800

    • Wired Connections: Windows, Raspberry Pi 4, Raspberry Pi 3, Odroid

    • Wireless Connections: Mac Mini

3) Power Connection

3. Software

  • 1) Applications

    • Installed on All PCs:

      • Docker, cAdvisor, Portainer Agent (except on macOS, which hosts the Portainer server itself), and rclone. A minimal compose sketch of this common stack appears after this list.
    • Installed on Specific PC:

      • macOS: Airflow, Portainer

      • Raspberry Pi 4: Cronicle, Grafana, Ghost, Redis (+ Exporter)

      • Raspberry Pi 3: Vault

      • Odroid: Prometheus, PostgreSQL (+ Exporter), MariaDB (+ Exporter), MongoDB (+ Exporter)

      • Windows 10:

        • WSL

        • WCS: windows_exporter (run manually, not installed as a service)
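To make the shared stack concrete, here is a minimal docker-compose.yml sketch for the services that run on every machine. The image tags, ports, and mount paths are illustrative assumptions, not my exact configuration.

```yaml
# Hypothetical per-host stack: cAdvisor + Portainer Agent.
# Image tags, ports, and volume paths are assumptions, not the exact setup.
services:
  cadvisor:
    image: gcr.io/cadvisor/cadvisor:latest
    ports:
      - "8080:8080"
    volumes:
      - /:/rootfs:ro
      - /var/run:/var/run:ro
      - /sys:/sys:ro
      - /var/lib/docker/:/var/lib/docker:ro
    restart: unless-stopped

  portainer_agent:
    image: portainer/agent:latest
    ports:
      - "9001:9001"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /var/lib/docker/volumes:/var/lib/docker/volumes
    restart: unless-stopped
```

The Portainer server (on the Mac Mini) then registers each host through its agent endpoint on port 9001.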

  • 2) Development Environment

    • VSCode:

      • Local VSCode + GitHub

      • Installed the code CLI to enable remote access via vscode.dev (see the sketch after this list)
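For reference, the tunnel that backs vscode.dev access can be started with the VS Code CLI; the commands below assume a reasonably recent CLI version.

```bash
# Start a remote tunnel; after authenticating, the machine appears at vscode.dev.
code tunnel

# Optionally install the tunnel as a background service so it survives reboots.
code tunnel service install
```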

    • Project Management:

      • Managed using Obsidian

4. Network & Monitoring

  • 1) Basic Network Structure

    • Connected to the Xiaomi router with IP range 192.168.0.XXX → Server Network

    • Connected to the IPTime router with IP range 192.168.1.XXX → Network for devices like TVs

  • 2) Monitoring and Management

    • Monitoring with Prometheus + Grafana

      • Uses cAdvisor and various types of Exporters

      • Currently focused on monitoring (data collection) only; alerting has not been set up yet. A sketch of the scrape configuration appears below.
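As an illustration of how the targets come together, here is a hedged prometheus.yml sketch. The IP addresses are placeholders, not my actual addresses; the ports are the common defaults (8080 for cAdvisor, 9100 for node_exporter, 9182 for windows_exporter).

```yaml
# Hypothetical scrape configuration for this layout.
# IP addresses are placeholders; ports are the usual defaults.
scrape_configs:
  - job_name: cadvisor
    static_configs:
      - targets:
          - "192.168.0.11:8080"   # mac mini
          - "192.168.0.12:8080"   # raspberry pi 4
          - "192.168.0.13:8080"   # odroid
  - job_name: node
    static_configs:
      - targets:
          - "192.168.0.12:9100"   # node_exporter
  - job_name: windows
    static_configs:
      - targets:
          - "192.168.0.14:9182"   # windows_exporter
```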

5. Data Management and Backup

  • Data Storage and Backup Approach

    • Currently, no dedicated backup system is in place.

    • Data is stored on the HDD (or SSD) of each server.

    • A 6TB HDD is available on the Odroid, which is currently being used as storage.

6. Use Cases

  • Current Server Use Cases

    • Recording EBS with Airflow

      • Airflow triggers daily → records EBS (on the Raspberry Pi 4) → uploads to Dropbox → I listen on a mobile device (sketched below)
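As a rough illustration of that pipeline, here is a minimal Airflow DAG sketch. The recording script path, rclone remote name, and directories are hypothetical placeholders, not the names used in my actual setup.

```python
# Hypothetical sketch of the daily EBS recording pipeline.
# record_ebs.sh and the "dropbox" rclone remote are placeholder names.
# Requires Airflow 2.4+ (older versions use schedule_interval).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="ebs_recording",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Record the broadcast on the Raspberry Pi 4.
    record = BashOperator(
        task_id="record_ebs",
        bash_command="/opt/scripts/record_ebs.sh",
    )

    # Upload the finished recording to Dropbox via rclone.
    upload = BashOperator(
        task_id="upload_to_dropbox",
        bash_command="rclone copy /data/recordings dropbox:recordings",
    )

    record >> upload
```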
    • Ghost blog installed on Raspberry Pi 4

      • Blog is set up, but no posts have been published yet
    • Backing up docker-compose.yml to Google Drive using rclone
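The backup itself is small enough to be a single rclone invocation, run by hand or from cron; the remote name gdrive and the paths here are assumptions, not my actual layout.

```bash
# Copy this host's compose file to Google Drive.
# "gdrive" is an assumed rclone remote name; adjust paths to your layout.
rclone copy ~/stacks/docker-compose.yml gdrive:backups/$(hostname)/
```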

7. Current Issues

  • Despite running a variety of applications, I get underwhelming utility out of them.

    • Reason: Many setups were done out of curiosity ("just to try"), so little has progressed beyond the basics.

    • Application-Related Needs:

      • Prometheus:

        • Need to set up notifications for when exporters go down, along the lines of the rule sketch below.
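Assuming Alertmanager will eventually deliver the notifications, the detection side could look roughly like this standard up == 0 rule:

```yaml
# Hypothetical alerting rule: fire when any scrape target stops responding.
# Delivering the notification still requires an Alertmanager setup.
groups:
  - name: exporter-health
    rules:
      - alert: ExporterDown
        expr: up == 0
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "Exporter {{ $labels.instance }} has been down for 5 minutes"
```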
      • Grafana:

        • Currently using standard dashboards for each exporter, which are not very visually striking.

        • Dashboards have to be opened in the browser, so problems are easy to miss → would prefer to integrate with Slack for alerts.

        • Some exporter data doesn’t match the expected values → Requires testing and validation.

      • Airflow:

        • Using the default Docker Compose setup, but DAGs sometimes don’t appear in the web UI, possibly due to high resource consumption.

        • Older database values from a previous version (unspecified) seem to be causing reference DAGs to persist and not be deleted automatically.

          • Manually deleting these reference DAGs appears to work, but they do not stay deleted (they show up again).

          • In the end, I removed all reference DAGs by running airflow db init with the reference flag set to false.

        • Logs generated by Airflow accumulate excessively → plan to create a DAG to manage log cleanup, along the lines of the sketch below.
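Something like the following could handle the cleanup; the log directory and the 14-day retention window are assumptions to adjust to the actual installation.

```python
# Hypothetical log-cleanup DAG. The log path and 14-day retention
# window are assumptions, not settings from my actual setup.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="airflow_log_cleanup",
    start_date=datetime(2024, 1, 1),
    schedule="@weekly",
    catchup=False,
) as dag:
    # Delete log files older than 14 days, then remove empty directories.
    BashOperator(
        task_id="delete_old_logs",
        bash_command=(
            "find /opt/airflow/logs -type f -mtime +14 -delete && "
            "find /opt/airflow/logs -type d -empty -delete"
        ),
    )
```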

      • Docker:

        • Other Docker setups are fine, but Docker on the Mac consumes an excessive amount of space.

          • The "system" usage on the Mac amounts to 150 GB, and it's unclear where this originates (the standard diagnostic commands below are a starting point).
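Two standard Docker commands can help narrow this down; they are general housekeeping, not something I have confirmed fixes this particular case.

```bash
# Break down Docker's disk usage by images, containers, volumes, and build cache.
docker system df -v

# Reclaim space from unused images, stopped containers, networks, and build cache.
# -a also removes unreferenced images; review before running.
docker system prune -a
```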
      • rclone:

        • Currently refreshing Google Drive API tokens manually once a week.

          • This is likely because it's a test account: Google expires refresh tokens for OAuth apps still in testing status after about a week. Moving to production requires meeting strict personal information verification requirements, which I have avoided so far.

          • Although refreshing the token weekly is not a major burden, it is quite annoying.
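When the token does expire, re-authorizing can at least be reduced to one command with rclone's reconnect subcommand; gdrive is an assumed remote name, and the command still opens an interactive OAuth flow.

```bash
# Re-run the OAuth authorization for the Google Drive remote.
# "gdrive" is an assumed remote name; this still requires a browser login.
rclone config reconnect gdrive:
```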

  • Hardware-Related Issues

    • The SD card in the Raspberry Pi 3 failed, so it no longer boots.

    • Since most of my servers are not in perfect condition, this kind of failure could recur at any time.

    • Based on this experience, I use docker-compose.yml to install and run most applications.

      • This way each application is fully described by its YAML file, and simply backing that file up (to Google Drive) makes reinstalling everything later much easier.

8. Conclusion

  • This article documents the technical environment I currently run, experimentally, at home.

  • I plan to update the content as I improve current issues or apply new ideas.

  • I am considering sharing the settings for each application and the docker-compose file, excluding some information.

This article will continue to be updated.