More Scrapli Related Updates
This post is a tidied-up version of a post I made in the scrapli channel on the networktocode slack community. If you want to be as up to speed as possible on scrapli-related things (and really, who doesn't want that??), you should check out the NetDev community here or the networktocode community here and pop into the scrapli channels there!
It's been a little while since I made any updates… but I've been quite busy!
You may have seen the "ginormous" update releases of scrapli, scrapli-netconf, and nornir-scrapli… that was a serious amount of work. Here's a quick recap of what it was all about:
- It was primarily aimed at cleaning up bad ideas and bad code from earlier on, and improving things based on lessons learned over the last year or so.
- There was also a very significant overhaul to the documentation. Before, we had a giant README and some auto-generated API docs; now we have a modern mkdocs setup with concise READMEs. We still have the auto-generated API docs, but they have been stuffed nicely into the mkdocs pages (in the API docs section, unsurprisingly!). You can check out the updated docs for scrapli core here: https://carlmontanari.github.io/scrapli/
- All of the previously existing transports now live inside of scrapli "core" – they are still optional extras, and scrapli core still has zero requirements if you don't need/want the other transports.
- Unit testing for scrapli core (in particular) has been redone and is much better in general.
You can read a blog post I wrote during the early phase of the "ginormous overhaul" here.
I've also thought a lot about versioning and how best to handle it. Rather than writing a bunch more here, you can read my thoughts on the new versioning strategy for scrapli here.
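For context, scrapli's new scheme is calendar-based (versions look like 2021.1.30). One nice property of date-shaped versions is that they compare naturally as tuples; here's a tiny stdlib sketch of that idea (the helper name is mine, not anything in scrapli):

```python
# Compare calendar-style (YYYY.M.D) version strings like scrapli's releases.
# calver_tuple is a hypothetical helper name, not part of scrapli itself.

def calver_tuple(version):
    """Parse a 'YYYY.M.D' calendar version into a comparable tuple of ints."""
    year, month, day = (int(part) for part in version.split("."))
    return (year, month, day)

# Tuples compare element-wise, so newer releases sort after older ones,
# even when the string comparison would get it wrong ("2020.12..." > "2020.2...")
assert calver_tuple("2021.1.30") > calver_tuple("2020.12.31")
assert calver_tuple("2020.12.31") > calver_tuple("2020.2.1")
```

Note that plain string comparison would fail here ("2020.12.31" < "2020.2.1" lexicographically), which is exactly why the parse-to-tuple step matters.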
If that wasn't enough, I seem to enjoy punishing myself, and to that end there are two new members of the scrapli family… Note that these are still in an alpha/beta type state, so use with caution, and please please let me know if you have any thoughts/improvements/suggestions/hate mail/etc.!
scrapli replay is all about enabling easy testing of scrapli programs. There are two main components. The first is a pytest plugin that behaves very much like vcr.py/pytest-vcr (if you are unfamiliar w/ these, I very much recommend checking them out). The second is a "collector" and a "server" that let you collect (kind of record) interactions with an SSH server (router/switch; it must be a "network device", or at least use a scrapli platform based on the NetworkDriver) and then replay that connection in a "semi interactive" fashion.
The pytest plugin should be helpful for folks that want to write tests for code that contains scrapli connections (works for netconf too!) but don’t have a mock device (or a real device) that they can connect to during test runs in their CI system.
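To give a flavor of the record/replay idea behind that kind of plugin, here is a stdlib-only stand-in: a fake "connection" that answers from canned output instead of a live device. Everything here is hypothetical illustration, not the actual scrapli-replay API:

```python
# Toy stand-in for a recorded device session -- the real plugin records
# actual scrapli interactions; this just illustrates the replay concept.

class FakeConnection:
    """Replays canned output for known commands, like a recorded session."""

    def __init__(self, recorded):
        # mapping of command string -> canned output captured earlier
        self.recorded = recorded

    def send_command(self, command):
        try:
            return self.recorded[command]
        except KeyError:
            raise ValueError(f"no recorded output for {command!r}")

def get_version(conn):
    """Example 'code under test' that normally talks to a real device."""
    return conn.send_command("show version").splitlines()[0]

# In a real test suite, the plugin would swap in the recorded session for you;
# here we wire it up by hand with made-up output:
conn = FakeConnection({"show version": "Fake OS Software, Version 1.2.3\nuptime: 1 day"})
assert get_version(conn) == "Fake OS Software, Version 1.2.3"
```

The point is that `get_version` never knows it isn't talking to a real device, which is what makes this pattern so handy in CI.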
The collector/server is sorta interesting and can be used to build a more "real life" server to use inside of CI (this is already being used in scrapli's own CI in the new "integration tests" section; more work to be done here). This is a "show" command only type of recording, meaning that we don't capture config state at all and won't keep track of that sort of thing (check the docs for more details; don't want to turn this into more of a wall of text than it already is).
Check out the docs (that probably still need more work) for some more info, or the repository.
The second new family member, scrapli cfg, is like NAPALM in a lot of ways, but strictly via CLI – no eAPI, no eznc, no pyiosxr, and no getters. Just config management via Telnet/SSH… that's right, Telnet. You can do full config replace via console servers (w/ a bit of work, to be fair, but it's pretty much all ready for it), or just over normal SSH without the need to enable eAPI or have NETCONF ports open, etc.
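At its core, a full config replace workflow is "here is the complete candidate config, reconcile it against what's running." The diff step of that idea can be shown with nothing but the stdlib (this is just an illustration, not scrapli cfg internals):

```python
import difflib

# Diff a running config against a complete candidate config -- the kind of
# preview a full-replace workflow shows before committing (made-up configs).

running = """hostname old-router
interface Ethernet1
 description uplink
""".splitlines()

candidate = """hostname new-router
interface Ethernet1
 description uplink
""".splitlines()

diff = list(
    difflib.unified_diff(running, candidate, "running", "candidate", lineterm="")
)

# Only the hostname changed, so only those lines show as -/+ in the diff
assert "-hostname old-router" in diff
assert "+hostname new-router" in diff
```

The appeal of the replace model is exactly this: you declare the whole intended config and let tooling figure out the delta, rather than hand-crafting incremental change commands.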
There is a bit more to this w/ some templating/automagic config replacement stuff in the works, but that is still fairly nascent at the moment. I also need to get this working on JunOS – if anyone is savvy w/ JunOS and wouldn't mind helping out, please DM me!
You can find the repository here, and the docs here (still more work to do on the docs probably!).