EmbeddedEntropy

joined 2 years ago
[–] [email protected] 11 points 2 years ago (2 children)

I’d rather have an M.2 connector without requiring a HAT.

I’ll stick with my Orange Pi 5 for now which comes with one, tyvm.

[–] [email protected] 4 points 2 years ago (3 children)

I've written hundreds (thousands?) of GNU Makefiles over the past 30 years and never had a need to unconditionally run particular targets before all others. GNU Make is a rule-based language. I'd suggest that what you're attempting is to force an imperative programming model onto a rule-based one, and you're going to run into trouble trying to code in a model other than the tool's native one.

Can you provide what you mean by check the environment, and why you'd need to do that before anything else?

For example, in the past I've wanted to determine whether, and which, particular command was installed, so I have near the top of my Makefile:

container_command_defaults = podman docker
container_command_default_paths := $(shell command -v $(container_command_defaults))

ifndef container_command
  container_command = $(firstword $(container_command_default_paths))
  ifeq ($(container_command),)
    $(error You must have docker or podman installed)
  endif
endif

Using the := operator with $(shell ...) is a way to run a command while GNU Make is initially parsing your Makefile. Normally, using the := assignment operator is antithetical to a rule-based language, so you want to limit its use as much as possible, but unusual exceptions can exist.
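If it helps to see that difference concretely, here's a throwaway sketch (the file name /tmp/expand.mk is just my example): := expands $(shell ...) once while the Makefile is being parsed, whereas = re-runs it on every expansion.

```shell
# Write a tiny Makefile comparing the two assignment flavors.
cat > /tmp/expand.mk <<'EOF'
now_immediate := $(shell date +%s)
now_deferred   = $(shell date +%s)

show:
	@echo "immediate: $(now_immediate)"
	@echo "deferred:  $(now_deferred)"
EOF

# "immediate" was captured at parse time; "deferred" is re-evaluated
# each time the variable is expanded in a recipe.
make -f /tmp/expand.mk show
```

On a long-running build the two timestamps will drift apart, which is exactly why := is the right tool for one-shot probes like command -v.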

I'm also unclear what you mean by "ensure variables are set". What kind of variables?

The above snippet shows how you can check whether a makefile variable is set when the Makefile is first parsed and, if not, declare an error and exit. (The same approach works for environment variables too.)
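For instance, here's a minimal sketch of that parse-time check applied to an environment variable (the variable name DEPLOY_ENV and the file name are just my examples):

```shell
# A Makefile that refuses to parse further if DEPLOY_ENV is unset/empty.
cat > /tmp/check_env.mk <<'EOF'
ifndef DEPLOY_ENV
  $(error DEPLOY_ENV must be set in the environment)
endif

all:
	@echo "DEPLOY_ENV is $(DEPLOY_ENV)"
EOF

# Unset: make stops at parse time, before any rule runs.
( unset DEPLOY_ENV; make -f /tmp/check_env.mk 2>&1 ) || true

# Set: the check passes and the rule runs normally.
DEPLOY_ENV=staging make -f /tmp/check_env.mk
```

Note that ifndef also treats an empty value as undefined, which is usually what you want for this kind of guard.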

Preparing a particular layout ahead of time is not the best approach. I'd suggest a given layout is nothing more than dependencies that should be declared as such.

Also, running specific targets or rules unconditionally can lead to trouble later as your Makefile grows. You may eventually have additional targets that, say, provide information about the build's state or run checks or tests. You wouldn't necessarily want those targets to go off and build an entire tree of directories for you or take other unnecessary actions.

If you want to ensure certain directories are present, add them as order-only prerequisites for those targets after the | character. For example:

build_directory ?= build
build_make = $(MAKE) ...
targets = ...

all: FORCE | $(build_directory)
	$(build_make) $(targets)

$(build_directory):
	mkdir -p -- '$@'

FORCE:
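If you want to try the order-only pattern in isolation, here's a stripped-down sketch (the file and directory names are my own, and I've dropped FORCE for brevity):

```shell
# A Makefile where the build directory is an order-only prerequisite:
# it gets created if missing, but its timestamp never forces a rebuild.
cat > /tmp/dirdemo.mk <<'EOF'
build_directory ?= build

all: | $(build_directory)
	@echo "building in $(build_directory)"

$(build_directory):
	mkdir -p -- '$@'
EOF

workdir="$(mktemp -d)"
make -C "$workdir" -f /tmp/dirdemo.mk
test -d "$workdir/build" && echo "build directory was created"
```

Running make a second time in the same directory re-runs the all recipe but skips the mkdir, since the order-only prerequisite is already satisfied.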

Even though I've been writing GNU Makefiles for decades, I still am learning new stuff constantly, so if someone has better, different ways, I'm certainly up for studying them.

[–] [email protected] 3 points 2 years ago

Part of the confusion, I find, is that he's trying to make a tech joke using something inherently non-technical: states' names.

[–] [email protected] 12 points 2 years ago (3 children)

I think the joke would have been better and more understandable if it had used different corporate names rather than states. But, of course, that might have been legally problematic.

[–] [email protected] 1 points 2 years ago

I’ve done both. I wrote my own scripts to generate the WG config files to handle the variations in configuration I needed for my different networks (masking, IPv6, crossing multiple WG networks).

After converting to Tailscale, WG is just an extra level of hassle I can now easily avoid.

[–] [email protected] 0 points 2 years ago (2 children)

Use Tailscale. Much easier to configure and manage than raw WireGuard.

[–] [email protected] 2 points 2 years ago (1 children)

Like IT gives you any time to get anything off a corporate-owned device.

When I got laid off, IT sent a bullet to my laptop immediately kicking me off and completely locking me out of it.

I was supposed to have another 4 days to transition my work. I contacted IT and was told once the bullet goes out, that’s it. Any and all access to everything has been terminated. Might as well just go home and enjoy the extra 4 days because no one’s going to undo a bullet going off early unless it comes from the C-suite. So I did.

[–] [email protected] 14 points 2 years ago* (last edited 2 years ago) (1 children)

Unless Gitlab changed things very recently, you only needed to provide a CC/DC if you wanted the free CI/CD pipeline enabled for your projects. Decline, and everything except the free pipeline works just fine.

[–] [email protected] 8 points 2 years ago

A 1979 TV show about a guy who put together a junk spaceship to salvage junk from the moon: Salvage 1.

My teenage self found it entertaining at the time. Hmmm, now where did I leave my parrot? I wonder if he could help me find a copy…

[–] [email protected] 1 points 2 years ago

All of them. Corp directive (now) is that hosts must be updated or reimaged every 90 days.

[–] [email protected] 13 points 2 years ago (1 children)

If they can have someone program a fee into their accounting systems, that means they know exactly what the fee is and under what conditions it applies. It’s trivial from there to sort, filter, and list them.
