SEMC

At SEMC we love using open source methods and research ideas to improve software quality and reliability in space 🌌. We are big on composable software: functional, typed, immutable. A part of the Nexus Aurora team, we develop a variety of software, from systems-level robotics to high-level configuration languages.

We’re open to contributors, so if you want to shape the future of the software industry working with innovative teammates, join us!

Resources
GitHub

Recent Discord Chat

#semc-chat
/home/bobbbay April 16, 2022 @ 11:15pm
Bump 🙂
/home/bobbbay April 16, 2022 @ 12:02am
That, and also https://github.com/semc-labs/Salo/blob/main/examples/syntax.sl (Thanks @ sudo)
sudo killall windows April 15, 2022 @ 2:05pm
https://semc-labs.github.io/Salo/
alone_coder April 15, 2022 @ 10:04am
@​Bobbbay (SEMC, MSO, RS, HaB, 🖖) I couldn't find any example of Salo code. I once created a language that was a mix between Lisp and Forth, named Listh :)
/home/bobbbay April 15, 2022 @ 12:58am
@​alone_coder For Salo, the language must have flexible parsing and networking libraries, and it must be memory-efficient. All I need to see is two popular libraries for these, and a reason that it's better than the current one.
alone_coder April 14, 2022 @ 4:37am
what are the requirements for a new language?
rough93 April 13, 2022 @ 2:11am
That's fantastic, let me see what I can do, bump me if I don't get you an update by Friday
/home/bobbbay April 12, 2022 @ 11:57pm
Does that help?
/home/bobbbay April 12, 2022 @ 11:57pm
Salo is currently written in Haskell, but that's not written in stone. Suitware is written in Rust, and I would like to keep it that way.
/home/bobbbay April 12, 2022 @ 11:56pm
For Salo, I'm looking for Programming Language theorists that can hammer out the language standard, and CEs that can implement this standard. I would also like CEs experienced in networking (especially with huge amounts of data), because that's the other half that Salo covers. For Suitware, I don't need any CS theorists specifically, but rather CEs that can implement the spacesuit functionality in a ROS-like paradigm.
rough93 April 12, 2022 @ 12:50pm
What kind of CS/CE skills are you looking for?
Tech_And_Sci April 12, 2022 @ 4:59am
I would help, but my CS skills are not yet up to par for me to contribute effectively.
/home/bobbbay April 11, 2022 @ 11:42pm
Well, if I knew, I would tell you 🙂. I can, however, tell you what we *have* done in the past. Marketing on NA social media (thanks Sudo) didn't get any contributors, and I imagine it's because NA followers are more so engineers and physicists than CS specialists. Marketing on Reddit, if I remember correctly, gained us ~5 new members, none of whom were experienced enough and/or had the time to commit to the project. It also took me a lot of time to prepare material/explanations for this. Long story short, marketing on CS-specific channels provided results, but not permanent ones, and took a lot of time to prepare. In reality, I'm not a marketer, so I don't know how you can help further, sorry! #marketing is probably more knowledgeable than I am.
rough93 April 11, 2022 @ 8:51pm
Not at all, the project has a lot of promise and shows that progress is possible. How do you think we can best support you in getting more members?
/home/bobbbay April 11, 2022 @ 8:44pm
I'm also o.k. with considering this project as stale until further notice. Not my favourite solution, but if it's what you have in mind, I won't say no.
/home/bobbbay April 11, 2022 @ 8:44pm
I still really believe that the software we're writing is great shit; I just don't have the time to write it all. Giving guidance is one thing, but doing it all from the ground up consumes time that I don't have.
/home/bobbbay April 11, 2022 @ 8:44pm
Hi Cameron, there hasn't been much progress lately because of exams and work. Unfortunately, I wasn't successful in finding more active members for the team, so SEMC's activity is linearly correlated with my free time (r=+1 🙂).
rough93 April 11, 2022 @ 5:31pm
Hey @​Bobbbay (SEMC, MSO, RS, HaB, 🖖)! How's progress?
/home/bobbbay March 11, 2022 @ 4:19pm
Thanks to anyone that takes a look!
/home/bobbbay March 11, 2022 @ 4:18pm
Well, if anyone would like to read the published docs, that would help. Anything that's under "What is Salo" can be considered the good copy. The rest of the docs are an unpublished draft that I have locally. The docs can be found at: https://semc-labs.github.io/Salo
rough93 March 11, 2022 @ 2:35pm
Documentation is always a vital part of project progress, anything you need a second set of eyes on?
/home/bobbbay March 11, 2022 @ 2:33pm
Not much. I've progressed a little with the designs, and written more unpublished documentation for Salo. Nothing of substance, basically.
rough93 March 11, 2022 @ 2:26pm
Hey! How's progress been this week?
/home/bobbbay March 4, 2022 @ 7:12pm
Thank you!
bigred March 3, 2022 @ 7:08am
that is a very coolly written "about".... 👍 to whoever wrote it 🙂
Betamaxx March 2, 2022 @ 10:17pm
http://stage.nexusaurora.org/projects/semc/
rough93 March 2, 2022 @ 12:32pm
Thanks! @​Betamaxx (James)
/home/bobbbay March 2, 2022 @ 4:02am
@​Cameron (SSAM) https://docs.google.com/document/d/1t1Jiu3v2ZSIGmyNJ6FHK0WEH7H1Pr_ATe-CQ-pzBwFI/edit?usp=drivesdk
rough93 March 1, 2022 @ 3:43pm
Hey @​Bobbbay (SEMC, MSO, RS, HaB, 🖖)! For the site update, could you provide a doc with this info? https://discord.com/channels/731855215816343592/812721892406460447/947884571796574258
/home/bobbbay February 14, 2022 @ 5:36pm
As a matter of fact, if there are any designers in NA, I would really rather they take a stab at it. I also have a sketch of the HUD that needs digitizing and editing.
/home/bobbbay February 14, 2022 @ 5:35pm
@​bigred I appreciate all of your points. I'm just a computer theorist, so that helps for sure. I'll incorporate this into a new version and let you know, cheers! (And, in case I didn't stress it enough, thanks to you too, Sam).
bigred February 13, 2022 @ 8:55am
The traditional view of how we go about space is to have 100 people and 100 computers control everything for 3 people in space. This view distracts from the real purpose of actually living in space. I feel it would be better to determine what the suit is for and concentrate on designing for that, rather than designing for control systems. The control systems will come as a natural part of the development process when you are thinking "what can I do to help me do this specific job in space" or "what tasks do I need to do to survive in space, and why do I want to do it". If it is (as in your example) to fix something on the ISS, then think about what a normal maintenance engineer would naturally do and need for fixing an engineering or maintenance problem on Earth. Then factor in the tools and abilities they need, and design your suit for that, including the environmental problems. This would appear to show that all the suit diags etc. are non-essential and distracting data; environmental and physical situation data are what the person needs to do the task and fix the problem. Just some thoughts. You may have already had these in mind 🙂
bigred February 13, 2022 @ 8:46am
I'm not sure if too much permanent data display is a good idea. It is necessary to be able to view data when needed, and it is definitely necessary to be able to have items highlighted in a very visible manner when "out of bounds". However, I feel it might be good to bear in mind the following:
- The environment being worked in is very hostile, volatile, and dangerous to make wrong moves in. By its nature it is very prone to people making mistakes, all sorts of mistakes, even things like missing a handhold, not seeing something coming, and losing orientation.
- By any suit's necessary design, the field of view is limited. In an environment with problems such as the above, it is absolutely necessary for a user to be exceptionally vigilant and have the best field of view possible. Items that restrict or distract from this already narrowed and isolated field of view could cause problems.
- The suit and environment isolate the user from most of the senses we utilise for data input to determine how to react within our environment. Therefore it might be best to focus on the ability to display items which can enhance or replace those missing senses. E.g., as a user cannot move freely to rotate and view all surroundings, perhaps some method of viewing behind them and in their peripheral vision.
/home/bobbbay February 12, 2022 @ 3:36pm
Yes, I like that, thank you! Regarding the "focus": I imagine someone putting on the suit and going out to fix something on the ISS (for example). They might want a second set of eyes on hand that can also keep an eye on their vitals. So it's really a mix of both diagnostics and monitoring. Does that seem right?
Ross (/u/_albertross) February 12, 2022 @ 3:29pm
- Non-critical data feeds (those which are within bounds) are smaller, in lighter colours, maybe even semi-transparent to reduce visual clutter on the screen. Can't do that with OneNote but hopefully the idea carries
- Very clear colour/design change when a variable goes out of bounds
- Centre of the screen is the suit diagnostic, giving a one-glance overview of whether everything is fine
Ross (/u/_albertross) February 12, 2022 @ 3:27pm
If it's the former I'm imagining something like this
Ross (/u/_albertross) February 12, 2022 @ 3:27pm
Depends what the focus is - suit diagnostics or activity monitoring
/home/bobbbay February 12, 2022 @ 3:23pm
Which component do you think should be in the center of the design?
/home/bobbbay February 12, 2022 @ 3:23pm
Right, to be honest, I'm not really sure *where* the eye should be drawn. I imagine the livestream should be #1. Hopefully with an actual stream it will be more appealing (?).
Ross (/u/_albertross) February 12, 2022 @ 3:20pm
Let me sketch something quickly
Ross (/u/_albertross) February 12, 2022 @ 3:19pm
The eye isn't really drawn anywhere
Ross (/u/_albertross) February 12, 2022 @ 3:18pm
It's very data-rich but I'm finding it very hard to see what I should really be looking at (obviously, because the data is generic)
/home/bobbbay February 12, 2022 @ 3:17pm
/home/bobbbay February 12, 2022 @ 3:16pm
I am not a designer; I'm open to suggestions.
/home/bobbbay February 12, 2022 @ 3:16pm
Initial Suitware dashboard design:
codeflight February 8, 2022 @ 2:26am
ah, ok
/home/bobbbay February 8, 2022 @ 2:24am
It will once I publish Canalise, a TCP framework for Aparan.
codeflight February 8, 2022 @ 2:23am
Looks cool, but the description is inaccurate. A main component of ROS messaging is cross-process messages, which this doesn't support. (I know this because I'm building a cross-process bus in Rust.)
/home/bobbbay February 8, 2022 @ 1:19am
Hi all, unfortunately, I haven't had much time to dedicate to SEMC this past week. That being said, I've reworked the Salo book, and I hope to work on it in the coming weeks. I also worked on Suitware's design, but I'm not ready to release that yet!
/home/bobbbay January 30, 2022 @ 7:41pm
Hi all, this week not a lot has been going on. I have mostly been working on design specifics. I separated Suitware and Aparan (https://github.com/semc-labs/aparan) and redid the documentation for Salo.
/home/bobbbay January 25, 2022 @ 4:42pm
A few months ago I put together a diagram of what Salo's deploy workflow should look like. It's now updated to the most recent version:
/home/bobbbay January 24, 2022 @ 3:13am
It's possible: Haskell has WASM support with Asterius by Tweag. Furthermore, in theory, one could simply not use WASM and create a web application. All in all, Salo is *just* a library, and any form of client can be created for it. As a matter of fact, if you check the source, you'll find an `app/` directory. This hosts client-specific content that is currently a command-line argument wrapper calling the Salo library from Cabal. It's simple to create a new executable named `Web` (or something similar), make Cabal compile it with Asterius, and call Salo from there. That's the magic of compilation 🙃.
sudo killall windows January 23, 2022 @ 8:59pm
sudo killall windows January 23, 2022 @ 8:51pm
I wonder if we could set up a live demo for salo with wasm
/home/bobbbay January 23, 2022 @ 8:49pm
Hi all, another summary for the week: I've made great progress on Salo's lexer (in Haskell). It can happily lex all of Salo's defined grammar. Here's a small demo:
```
[email protected]:~/projects/salo$ stack run -- repl
Salo λ> 1 + 1
[L (AlexPn 0 1 1) (LInt 1) (Just "1"),L (AlexPn 2 1 3) LPlus (Just "+"),L (AlexPn 4 1 5) (LInt 1) (Just "1"),L (AlexPn 0 0 0) LEOF Nothing]
Salo λ> module x
[L (AlexPn 0 1 1) (LIdentifier "module x") (Just "module x"),L (AlexPn 0 0 0) LEOF Nothing]
Salo λ> true
[L (AlexPn 0 1 1) (LBool True) (Just "true"),L (AlexPn 0 0 0) LEOF Nothing]
Salo λ> "
E: String not closed at end of file
```
Of course, much more is supported... See https://github.com/semc-labs/Salo/blob/main/examples/syntax.sl. That's all for this week!
rough93 January 17, 2022 @ 5:04pm
Sounds good @​Bobbbay (SEMC, MSO, RS, HaB, 🖖), enjoying the updates and the software discussion it's starting
/home/bobbbay January 17, 2022 @ 4:50pm
Hi all, a summary of what has been going on this past week: I finished a rudimentary type system for Salo. The in-progress paper can be found in #💻software. I have also started the project in Haskell. I plan to have a full parser by next week, so I can start working on the evaluator. Cheers!
/home/bobbbay January 9, 2022 @ 8:05pm
Hi everyone, here is what has been happening this past week in SEMC:
* I have been working on Aparan, Suitware's to-be internal scheduling framework. It has been going great! There's full support for asynchronous channels and workers. I want to focus less on Suitware and more on Salo this month.
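A minimal sketch of the channels-and-workers pattern described above, using tokio (with the `rt` and `macros` features enabled). The `Task` type and worker loop are purely illustrative; Aparan's actual API may look quite different:
```rust
use tokio::sync::mpsc;

// Hypothetical task type; Aparan's real message types are likely different.
#[derive(Debug)]
enum Task {
    ReadSensor(&'static str),
    Shutdown,
}

#[tokio::main]
async fn main() {
    // An asynchronous channel connecting the scheduler side to a worker.
    let (tx, mut rx) = mpsc::channel::<Task>(32);

    // Worker: pulls tasks off the channel and processes them.
    let worker = tokio::spawn(async move {
        while let Some(task) = rx.recv().await {
            match task {
                Task::ReadSensor(name) => println!("reading sensor: {name}"),
                Task::Shutdown => break,
            }
        }
    });

    // Scheduler side: enqueue some work, then tell the worker to stop.
    tx.send(Task::ReadSensor("temperature")).await.unwrap();
    tx.send(Task::Shutdown).await.unwrap();
    worker.await.unwrap();
}
```
Spawning more workers, each with its own channel, gives the scheduler a simple fan-out.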
/home/bobbbay January 2, 2022 @ 7:55pm
Hello everyone, here's a weekly update:
* I've started working on a distributed, asynchronous framework to base Suitware upon. I'm surprised by how easy it has been to write, so far.
* I started developing the UI. It's not my strong suit, but using Sixtyfps has been a pleasure so far. There's a basic layout, and a functioning temperature graph as a proof of concept. The layout is not set in stone, so I'd appreciate any ideas on how a remote-access application for the suit might look.
TL;DR: happy advances on the technical side. Cheers!
/home/bobbbay January 2, 2022 @ 3:47pm
This UI is the representation of the program under-the-hood, if that helps :).
Szymek (Simon) Matkowski January 2, 2022 @ 3:44pm
Or am I confusing the definition of UI?
Szymek (Simon) Matkowski January 2, 2022 @ 3:44pm
Is the UI a representation for the real user, the astronaut? Or for the programmer?
/home/bobbbay January 2, 2022 @ 3:43pm
@​Szymek (Simon) Matkowski thank you for the feedback! Unfortunately, I'm not talented at engineering audio, so I will have to ask someone else down the line to incorporate your feedback into the synthesis. Regarding point no. 3, which UI mockup is this about? The architecture image is for internal usage only; it doesn't serve as a UI. Sorry for the confusion.
Szymek (Simon) Matkowski January 2, 2022 @ 3:34pm
3. A UI mockup requires a list of the main features, constant and temporary, that are to be displayed. It is the same task as designing a website.
Szymek (Simon) Matkowski January 2, 2022 @ 3:34pm
hi @​Bobbbay (SEMC, MSO, RS, HaB, 🖖) 1. The sound should definitely be 3D, changing as the astronaut moves. 2. I wonder how many different signals can be accepted by a human being; if you find yourself with a number of objects around you, this may get quite messy. I would say that not the velocity but the momentum is more critical.
/home/bobbbay December 26, 2021 @ 5:05pm
Synthesis examples:
/home/bobbbay December 26, 2021 @ 4:46pm
Architecture doc image:
/home/bobbbay December 26, 2021 @ 4:45pm
Hello everyone, it's the same day of the week again! This past week, I've gathered a few visual aids to make this process more interesting. Here are the big new features for Suitware from this week:
* Added simulation support
* Added sound synthesis
* Added opentelemetry support for logging
Here are the "suitware-synthesis" examples (thanks @​Sam (FARMM)). The idea is that the sound of the vehicle changes depending on its state. Currently, as a proof of concept, the sound depends only on distance and vehicle velocity. The state is written in the filenames of the following audio files.
Furthermore, I wrote an architecture chart for the documentation. There is, imaginably, not much documentation on Suitware, and I would like to have some helping hands, so these are some first steps. I've attached the architecture image in case it's of interest. (Can also be found on .) In the same vein, I wrote some contributing documentation under `/docs`.
Finally, I now have a public TODO list on GitHub: . See you next week!
Ross (/u/_albertross) December 19, 2021 @ 11:11pm
But anyone who's worked in a noisy and information-rich environment like a construction site can testify that you learn what sounds mean very quickly
Ross (/u/_albertross) December 19, 2021 @ 11:11pm
Yep it wouldn't have to necessarily be realistic sounding, that's a matter for the audio artists 😛
/home/bobbbay December 19, 2021 @ 11:05pm
Ah, I see. Yes, that's certainly possible, but I'm not an expert in audio synthesis so I can't promise the most realistic sounds ;) In all seriousness, sounds like a great addition.
Ross (/u/_albertross) December 19, 2021 @ 9:25pm
If you have a way of solidly mapping state vector to sound, you're set. For a vehicle it would be something like:
```
position                                      -> location of sound in stereo audio
absolute distance from suit's head            -> volume
absolute velocity                             -> pitch of tone
status of engine                              -> timbre (running at full power sounds different to coasting)
rate of change of absolute distance from suit -> doppler shift
```
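Sketching that mapping in code makes it concrete. Everything below (the types, constants, and falloff formulas) is an assumption for illustration, not anyone's actual implementation:
```rust
// Illustrative state-to-audio mapping; all names and constants are made up.

struct StateVector {
    position: [f64; 3], // relative to the suit's head, in metres
    velocity: [f64; 3], // in metres per second
}

struct AudioParams {
    pan: f64,      // -1.0 (hard left) .. 1.0 (hard right)
    volume: f64,   // 0.0 .. 1.0
    pitch_hz: f64, // base tone frequency
}

fn norm(v: [f64; 3]) -> f64 {
    (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt()
}

fn to_audio(state: &StateVector) -> AudioParams {
    let distance = norm(state.position);
    let speed = norm(state.velocity);
    AudioParams {
        // Stereo position from the left/right component of the bearing.
        pan: (state.position[0] / distance.max(1e-9)).clamp(-1.0, 1.0),
        // Volume falls off with distance.
        volume: (1.0 / distance.max(1.0)).min(1.0),
        // Pitch rises with speed, like an engine under load.
        pitch_hz: 200.0 + 40.0 * speed,
    }
}

fn main() {
    let truck = StateVector { position: [10.0, 2.0, 0.0], velocity: [3.0, 0.0, 0.0] };
    let p = to_audio(&truck);
    println!("pan {:.2}, volume {:.2}, pitch {:.0} Hz", p.pan, p.volume, p.pitch_hz);
}
```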
Ross (/u/_albertross) December 19, 2021 @ 9:22pm
For a vehicle that's obviously position/velocity but for industrial equipment it might be pressure/flow/power/voltage
Ross (/u/_albertross) December 19, 2021 @ 9:22pm
Sorry, my confusion with rocketry terms - you can use a state vector to describe the state of a piece of equipment from any arbitrary set of operating conditions
/home/bobbbay December 19, 2021 @ 2:15pm
Regarding question 1, there is one possible, slightly elegant solution. As long as the spacesuit knows the zero (say, somewhere at mission control on Mars), it can relay that information on handshake with the device. So, imagine device A has come into range of the spacesuit. It sends a signal to the suit, asking to connect. The spacesuit would respond with two pieces of information: its own position relative to zero (the position vector `r` in a state vector), as well as the new device's position relative to the suit. With these two pieces of information, the device can put together its state vector for *right now*, allowing it to figure out its state vectors for the future. The only issue is finding `r` for the suit and the device.
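The arithmetic of that handshake is plain vector addition; a tiny sketch, with made-up numbers:
```rust
// The device's position relative to zero is the suit's position relative to
// zero plus the device's position relative to the suit. Values are made up.

type Vec3 = [f64; 3];

fn add(a: Vec3, b: Vec3) -> Vec3 {
    [a[0] + b[0], a[1] + b[1], a[2] + b[2]]
}

fn main() {
    // `r` for the suit, relative to the agreed zero (e.g. mission control).
    let suit_r: Vec3 = [1200.0, -350.0, 4.0];
    // Device position relative to the suit, e.g. measured on handshake.
    let device_rel_suit: Vec3 = [2.5, 0.0, -1.0];

    // Now the device can anchor its own state vector to the shared zero.
    let device_r = add(suit_r, device_rel_suit);
    println!("device r relative to zero: {:?}", device_r);
}
```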
/home/bobbbay December 19, 2021 @ 2:10pm
@​Sam (FARMM) That sounds like a great idea. Does Shipbreaker have that? Here are a few initial questions that arise:
* Comparing state vectors is fine, but where is the "0" of each item? It could be the suit's head itself, but considering that it moves so often, that's not such a great idea.
* This means that every piece of equipment will have to have some sort of gyroscope, in order to know its own state vector at all times, and some sort of networking tool to communicate with the suit. Not necessarily impossible, just something to keep in mind, especially for smaller devices that, by nature, don't need such a tool.
Barring that, the rest is 100% possible. We could even encode the state vector of the suit into the gurgling sounds, for some fun effect :).
Ross (/u/_albertross) December 19, 2021 @ 12:36am
You could even have intrinsic warnings for hazards in the environments. Adding a gurgling to liquid-filled pipes (based on pressure or flow rate sensors), or a low hum to powered-on electrical equipment, gives you another information layer
Ross (/u/_albertross) December 19, 2021 @ 12:35am
For example, when you're working on the surface there's a need to be aware of where vehicles are relative to you and something about them. You could synthesise an engine noise with a varying pitch for speed (just like a real transmission system) and feed it into each headphone to let the user know where every vehicle in the environment is, and roughly what it's doing, completely interface-free
Ross (/u/_albertross) December 19, 2021 @ 12:33am
But if we've got a big beautiful networked system that's tracking the state vector of every piece of equipment in the locale, as well as the position relative to the suit's head, we can generate real-time audio based on that data and pipe it in using a binaural system to give it a sense of position and a "status feed"
Ross (/u/_albertross) December 19, 2021 @ 12:32am
Hear me out (pun intended). Lots of the time you're operating in a spacesuit, you need operational awareness of a lot of things in your immediate environment as well as a deep focus on the ones immediately around you. In a shirtsleeves environment that's easy, you can listen around you for what's happening and be alerted very quickly to even a very slight change in the environment. In space, even on Mars with a thin atmosphere, you don't have any substantial transmission of sound waves apart from through solid objects. Hence no awareness
Ross (/u/_albertross) December 19, 2021 @ 12:30am
Hey @​Bobbbay (SEMC, MSO, RS, HaB, 🖖) I was just playing some Shipbreaker this evening (excellent game) and it gave me an idea that I think might be worth discussing 3D audio feedback systems for spacesuits
/home/bobbbay December 19, 2021 @ 12:10am
Hey everyone, it's that time of the week again. I haven't had much time to work on projects due to finals. In whatever time I have had, however, I've successfully experimented with simulating spacesuit behaviour via the HAL. I have some great uncommitted changes on my laptop regarding this! Cheers, until next week! 🤝
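The HAL idea can be pictured as one interface with interchangeable real and simulated backends. A minimal sketch, where the trait and names are illustrative rather than Suitware's actual API:
```rust
// One interface, two possible backends: real hardware or a simulation.

trait TemperatureSensor {
    fn read_celsius(&mut self) -> f64;
}

// Simulated sensor: produces plausible values with no hardware attached.
struct SimulatedSensor {
    t: f64,
}

impl TemperatureSensor for SimulatedSensor {
    fn read_celsius(&mut self) -> f64 {
        // Drift slowly so the simulated trace isn't a flat line.
        self.t += 0.1;
        21.0 + self.t.sin()
    }
}

fn main() {
    let mut sensor = SimulatedSensor { t: 0.0 };
    for _ in 0..3 {
        println!("suit temperature: {:.2} C", sensor.read_celsius());
    }
}
```
Code written against the trait never needs to know which backend it is talking to, which is what makes simulation a drop-in replacement.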
bigred December 13, 2021 @ 12:14pm
give it time.... that is great initiative 👍
/home/bobbbay December 12, 2021 @ 11:53pm
Monitoring the spacesuit remotely, e.g. from mission control or from a spacecraft. The RAA has the same information about the suit (sensors, etc.) as the HUD; it's just displayed in a different place and in a different way.
rough93 December 12, 2021 @ 6:05pm
what is the use-case for RAA?
/home/bobbbay December 12, 2021 @ 4:08pm
Hello everyone! Here's a summary of SEMC progress this past week.
* I have spent most of my time this week on Suitware. I'm pretty excited about the project: I established a bidirectional gRPC stream to a client this week. I also created two separate client apps, the HUD and the remote access application (RAA). The former is for use by the spacesuit wearer themselves, and the latter for mission control at home base or anywhere else. I've also been looking into https://github.com/rust-vmm this week, because I believe it can be used to simulate a real spacesuit (sensors and all) virtually.
* Not much for Salo this week.
Cheers!
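A simplified picture of that architecture: one telemetry stream fanned out to both the HUD and the RAA. The sketch below uses a tokio broadcast channel as a stand-in for the real gRPC plumbing, and the `Telemetry` type is an assumption:
```rust
use tokio::sync::broadcast;

// Stand-in for the real telemetry messages carried over the gRPC stream.
#[derive(Clone, Debug)]
struct Telemetry {
    sensor: &'static str,
    value: f64,
}

#[tokio::main]
async fn main() {
    let (tx, _) = broadcast::channel::<Telemetry>(16);
    let mut hud = tx.subscribe();
    let mut raa = tx.subscribe();

    // Both clients receive the same sensor data, displayed differently.
    let hud_task = tokio::spawn(async move {
        while let Ok(t) = hud.recv().await {
            println!("[HUD] {}: {:.1}", t.sensor, t.value);
        }
    });
    let raa_task = tokio::spawn(async move {
        while let Ok(t) = raa.recv().await {
            println!("[RAA] {}: {:.1}", t.sensor, t.value);
        }
    });

    tx.send(Telemetry { sensor: "o2_pressure", value: 21.3 }).unwrap();
    drop(tx); // closing the channel ends both client loops
    let _ = tokio::join!(hud_task, raa_task);
}
```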
sudo killall windows December 8, 2021 @ 1:28pm
For the GitHub picture?
Szymek (Simon) Matkowski December 8, 2021 @ 4:04am
Maybe try using another image?
Szymek (Simon) Matkowski December 8, 2021 @ 4:03am
sudo killall windows December 7, 2021 @ 2:54am
It'll get more exposure over the next few hours ig
sudo killall windows December 7, 2021 @ 2:53am
https://twitter.com/NexusAurora/status/1468046765115023361 tried to get some new people in but doesn't seem to be going well
sudo killall windows December 7, 2021 @ 12:24am
I more meant just a mock-up for PR
/home/bobbbay December 6, 2021 @ 11:59pm
It would certainly be interesting to see a HUD done in Figma, but I doubt the software has the capabilities.
sudo killall windows December 6, 2021 @ 10:46pm
If you can get a demo of it in figma or similar that we can put on the web that would definitely kick start interest in SEMC
/home/bobbbay December 6, 2021 @ 10:45pm
That's awesome, will take a look. Certainly not my area of expertise, but I think I've seen similar technology on cars. Thanks!
/home/bobbbay December 6, 2021 @ 10:42pm
Yes! I would like to create a screen recording as a tour of the software once there is more user-facing content.
Ross (/u/_albertross) December 6, 2021 @ 6:54pm
https://www.youtube.com/watch?v=Z2o_Sp2-aBo Very clever bit of technology, plus this guy has already completely designed the driver board
Ross (/u/_albertross) December 6, 2021 @ 6:52pm
@​Bobbbay (SEMC, MSO, RS, HaB, 🖖) Depending on how much detail you want on the HUD (just indicator lights vs a full LCD-type display), electroluminescent displays might be the way to go. No additional projection hardware required, can be made conformal to curved surfaces, relatively easy to drive
rough93 December 6, 2021 @ 6:46pm
Scheduling and a HUD are interesting components to the new suitware, any plans to show off some demos once you get further in?
/home/bobbbay December 6, 2021 @ 6:40pm
Hello everyone! Here's a summary of SEMC progress this past week.
* A new (small, experimental) project was started: Suitware. As the name implies, this project has the goal of being a proof of concept for spacesuit software. There are many interesting components to the project: asynchronous scheduling, a HUD, and a scalable hardware abstraction layer. If the EVO project starts again, Suitware should be a drop-in solution for the software on that front.
* Not much for Salo this week.
Cheers. https://github.com/semc-labs/suitware