
Build a Personal “Go To” Link Service With TypeScript and GitHub

Look up GitHub faster with just gh

Jackson MZ · Published in Better Programming · May 12, 2021 · 5 min read

GitHub logo and search bar. Photo by the author.

As a developer, I spend a lot of time jumping between GitHub repositories, open source projects, and Stack Overflow questions.

It’s frustrating to start every new tab by searching on Google, looking through bookmarks, or typing the full URL.

In companies, developers commonly use some form of internal “go to” link service to navigate quickly. For example, they might define gh to represent GitHub or q to represent Stack Overflow.

Similar services can be handy outside work as well.

There are a few existing solutions, but none of them are flexible enough for my use cases.

For example, golink is only enterprise-facing, and trotto requires hosting your own server. Most importantly, they all store the link mappings on a proprietary server, which is neither portable nor customizable.

As a developer, when the existing solutions aren’t enough, it’s time to build your own.

“Go To” Link Service = Client + Mapping + Translator

A “Go To” link service isn’t complicated. This is all we need:

  • A gateway client that is easy enough to access (e.g. a website saved as a bookmark or a custom search engine).
  • Storage for the URL mappings and a way to fetch them (e.g. from gh to https://github.com). It can be as simple as a YAML file stored on GitHub or even a gist snippet.
  • Translation logic.

Here is how the components mentioned above make a “Go To” link system:

Overview of a “Go To” link system
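
To make this concrete, here is a minimal TypeScript sketch of the mapping and translation pieces. The example keys, the path-appending behavior, and the search-engine fallback are illustrative assumptions, not the exact implementation:

```typescript
// A minimal sketch of the mapping and translation pieces.
type LinkMap = Record<string, string>;

// Example mapping; in practice this could be loaded from a YAML file or a gist.
const links: LinkMap = {
  gh: "https://github.com",
  q: "https://stackoverflow.com",
};

// Translate a query like "gh" or "gh torvalds/linux" into a full URL.
function translate(query: string, map: LinkMap): string {
  const [key, ...rest] = query.trim().split(/\s+/);
  const base = map[key];
  if (!base) {
    // Unknown short link: fall back to a regular web search.
    return `https://www.google.com/search?q=${encodeURIComponent(query)}`;
  }
  // Anything after the key becomes a path, e.g. "gh torvalds/linux" -> https://github.com/torvalds/linux
  return rest.length > 0 ? `${base}/${rest.join("/")}` : base;
}

console.log(translate("gh", links));                // https://github.com
console.log(translate("gh torvalds/linux", links)); // https://github.com/torvalds/linux
```

The gateway client (a bookmark or custom search engine) only needs to pass whatever the user typed into this translator and redirect to the result.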

Building With GitHub Goodies = Simple + Open

Over the last few years, GitHub has made huge progress, adding tools that make it close to an all-in-one solution.

Building a project on GitHub’s own technologies reduces the maintenance effort for developers and opens up possibilities for creative extensions.
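
As one example of leaning on GitHub for the storage piece, the mapping file could live in a public repository and be fetched through its raw URL. The repository path, file name, and the js-yaml dependency below are assumptions for illustration:

```typescript
import { load } from "js-yaml"; // assumed YAML parser dependency

type LinkMap = Record<string, string>;

// Hypothetical raw URL of a links.yml kept in a public GitHub repo.
const MAPPING_URL =
  "https://raw.githubusercontent.com/your-user/your-goto-repo/main/links.yml";

// Download the YAML mapping from GitHub and parse it into a plain object.
async function loadLinks(url: string = MAPPING_URL): Promise<LinkMap> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Failed to fetch link mapping: ${res.status}`);
  }
  return load(await res.text()) as LinkMap;
}

// Usage:
// loadLinks().then((links) => console.log(links["gh"]));
```

Because the file is just YAML in a repository you own, the mapping stays portable and can be edited from anywhere GitHub is reachable.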
