Today I had a little aha moment. If anyone had asked me yesterday about AI tools integrated into their editor, I would have said it's a bad idea. Ask me today, and I would still say it's a bad idea. :D Because I don't want to rely on AI tools and get too comfortable with them. Especially if they are from big companies and communicate over the internet. That is a no-go for me.
But for weeks now I have been playing around with offline AI tools and models that I can download and run locally on my low-end gaming PC. Mostly for silly questions and such. It's not integrated into any other software, other than its dedicated application: GPT4All (no, it has nothing to do with ChatGPT).
I'm working on a small GUI application in Rust and am still figuring things out. I'm not good at it, and there was a point where I had to convert a function into an async variant. After researching, trying things, and reading documentation, I could not solve it. Then I asked the AI. While the output was not functioning out of the box, it helped me find the right puzzle pieces. To be honest, I don't understand everything yet, and I know this is bad. It would be really bad if this were work for a company, but it's a learning project.
Anyone else not liking AI, but taking help from it? I am still absolutely against integrated AI tools that also require an online connection to the companies' servers. Edit: Here is the before and after. (BTW, the code block on Beehaw is broken, as the less-than and ampersand characters are automatically translated into &lt; and &amp; respectively.)
From:
pub fn collect(&self, max_depth: u8, ext: Option<&str>) -> Files {
    let mut files = Files::new(&self.dir);
    for entry in WalkDir::new(&self.dir).max_depth(max_depth.into()) {
        let Ok(entry) = entry else { continue };
        let path = PathBuf::from(entry.path().display().to_string());
        if ext.is_none() || path.extension().unwrap_or_default() == ext.unwrap() {
            files.paths.push(path);
        }
    }
    files.paths.sort_by_key(|a| a.name_as_string());
    files
}
To:
pub async fn collect(&self, max_depth: u8, ext: Option<&str>) -> Result {
    let mut files = Files::new(&self.dir);
    let walkdir = WalkDir::new(&self.dir);
    let mut walker =
        match tokio::task::spawn_blocking(move || -> Result {
            Ok(walkdir)
        })
        .await
        {
            Ok(walker) => walker?,
            Err(_) => return Err(anyhow::anyhow!("Failed to spawn blocking task")),
        };
    while let Some(entry) = walker.next().await {
        match entry {
            Ok(entry) if entry.path().is_file() => {
                let path = PathBuf::from(entry.path().display().to_string());
                if ext.is_none() || path.extension().unwrap_or_default() == ext.unwrap() {
                    files.paths.push(path);
                }
            }
            _ => continue,
        }
    }
    files.paths.sort_by_key(|a| a.name_as_string());
    Ok(files)
}
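For comparison, a simpler pattern (just a minimal sketch, not my actual code) would be to do the entire blocking directory walk inside tokio::task::spawn_blocking and only await the finished result. It assumes the synchronous walkdir crate plus a tokio runtime, and it returns a plain Vec<PathBuf> instead of my Files type; collect_paths is a made-up free function used only for illustration:

use std::path::PathBuf;
use walkdir::WalkDir;

pub async fn collect_paths(
    dir: PathBuf,
    max_depth: u8,
    ext: Option<String>,
) -> anyhow::Result<Vec<PathBuf>> {
    // Run the whole synchronous walk on tokio's blocking thread pool.
    let paths = tokio::task::spawn_blocking(move || {
        let mut paths: Vec<PathBuf> = WalkDir::new(&dir)
            .max_depth(max_depth.into())
            .into_iter()
            .filter_map(|entry| entry.ok()) // skip unreadable entries
            .filter(|entry| entry.path().is_file())
            .map(|entry| entry.path().to_path_buf())
            .filter(|path| match &ext {
                None => true, // no filter given: keep every file
                Some(want) => path.extension().map_or(false, |e| e == want.as_str()),
            })
            .collect();
        paths.sort();
        paths
    })
    .await?; // a JoinError (panicked/cancelled task) converts into anyhow::Error
    Ok(paths)
}

The nice part of that shape is that the filesystem work never runs on the async executor threads, and the body itself stays ordinary synchronous iterator code.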
AI is surprisingly helpful for providing a starting point. When you want a hello-world app, an example of how to use some part of a crate, or a code snippet showing how to take what you have and do something unusual with it, AI is super useful.
I would love to be able to get the same quality of AI locally as I do from ChatGPT. If that’s possible, please let me know. I’ve got two 3090s ready to go.
But for now, I'm just enjoying the fact that ChatGPT is free. Once they put up a paywall, it's back to suffering (or maybe/probably trying out some open-source models).
Gemma 2 27B IT is reasonably good for computer-related things. It's available as a GGUF model on Hugging Face, which is compatible with koboldcpp, and that means multi-GPU acceleration.
I agree. Even when Copilot gets something completely wrong, it's usually easier to think "no, that's wrong, it should be this" than "OK, the first line of code should be…".
It completely solves the “blank page” problem.