In this guide, we’ll walk through setting up a local AI-powered coding assistant within Visual Studio Code (VS Code). By leveraging tools such as Ollama, CodeStral, and the Continue extension, we can enhance our development workflow, get intelligent code suggestions, and even automate parts of the coding process.
First, pull and run the model from your terminal:
ollama run granite_8b
Next, open the Continue extension's configuration file (config.json) and, if you prefer to disable telemetry, set the allowTelemetry parameter to false. Add the following configuration:
{
  "models": [
    {
      "provider": "ollama",
      "name": "granite_8b",
      "model": "granite_8b"
    }
  ],
  "allowTelemetry": false
}
Once configured, you can highlight code, press Ctrl+L, and interact with the AI. For instance, if you highlight a few lines of code and ask, "How can I improve this code?", the AI will analyze the snippet and suggest improvements.
If your machine has more than one GPU, use the nvidia-smi -L command to identify the unique ID of each card. You can then set the CUDA_VISIBLE_DEVICES environment variable to ensure the AI model utilizes the right GPU for faster performance.
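A sketch of the pinning, with a placeholder UUID standing in for whatever nvidia-smi -L actually reports on your machine:

```shell
# nvidia-smi -L prints one line per card, e.g.:
#   GPU 0: <card name> (UUID: GPU-xxxx)   <- illustrative, not real output
# Pin the model server to one card via its UUID before starting it:
export CUDA_VISIBLE_DEVICES="GPU-xxxx"
echo "$CUDA_VISIBLE_DEVICES"
# prints: GPU-xxxx
```

Processes started from this shell (such as ollama serve) will then only see the selected GPU.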
Outsourcing companies can be hell for an experienced programmer, because managers are allowed to make mistakes that get covered up, which puts the rest of the workers in an unfavourable position.
So it is very important to keep track of your health, and please do not try to compensate for a toxic workplace by taking it out on other people, overeating, not sleeping, etc.
Restorative activities include running and weight lifting.
From the herbs, I recommend thyme, hawthorn, and passiflora tea to help with sleep, as well as taking ashwagandha.
Take your time and enjoy!
1) Create the User model (app/Models/User.php):
php artisan make:model User
2) Create validation for the update requests:
php artisan make:request UserUpdateRequest
and for the post requests:
php artisan make:request UserPostRequest
Then create a user controller based on the User model:
php artisan make:controller UserController --model=User --resource
3) Create resources/UserCollection, to return a user collection when required by the user controller:
php artisan make:resource UserCollection
and create UserResource, which exposes which fields are returned in the JSON response:
php artisan make:resource UserResource
4) Enable the requests to be performed, and add the validation rules for posting and updating information, inside app/Http/Requests/UserPostRequest.php and app/Http/Requests/UserUpdateRequest.php.
5) Add a route in routes/api.php in order to direct /users to the index() method of the UserController.
Cheers!
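The routes/api.php wiring can be sketched as below; the file is written to /tmp here purely to illustrate its contents, and the class-based Route::apiResource syntax assumes Laravel 8 or newer:

```shell
# Illustrative only: the would-be contents of routes/api.php
cat > /tmp/api_routes_example.php <<'EOF'
<?php
use App\Http\Controllers\UserController;
use Illuminate\Support\Facades\Route;

// GET /users -> UserController@index (plus the other resource verbs)
Route::apiResource('users', UserController::class);
EOF
grep -c 'apiResource' /tmp/api_routes_example.php
# prints: 1
```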
1) enable wildcard listening address of the app like 0.0.0.0
ss -anpst will show you on which address/port the app is listening.
2) use powershell to setup portproxy to forward all the outside requests to the windows machine to land in the WSL2 system:
netsh interface portproxy add v4tov4 listenport=3000 listenaddress=0.0.0.0 connectport=3000 connectaddress=localhost
listenport and listenaddress are on the Windows side.
connectport and connectaddress are on the WSL2 side.
(for a Node app the listening port (connectport) is usually 3000; check your app's listening port in step 1)
verify with: netsh interface portproxy show all
3) open port 3000 on the firewall with:
netsh advfirewall firewall add rule name="WSL2 app" dir=in action=allow protocol=TCP localport=3000
verify from the windows defender firewall, advanced settings, inbound rules.
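To make step 1 concrete, here is how the relevant column can be pulled out of ss output; the node line below is fabricated sample data, not real output from your machine:

```shell
# Fabricated `ss -anpst` excerpt for illustration
cat > /tmp/ss.sample <<'EOF'
State  Recv-Q Send-Q Local Address:Port Peer Address:Port Process
LISTEN 0      511    0.0.0.0:3000       0.0.0.0:*         users:(("node",pid=123,fd=18))
EOF
# Print the local address:port of every listening socket
awk '$1 == "LISTEN" { print $4 }' /tmp/ss.sample
# prints: 0.0.0.0:3000  (wildcard address, as step 1 requires)
```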
Cheers !
Instructions for MAC M1 instance:
1) Update your package.json with the latest tns-ios version!
2) run from /platforms directory: tns platform remove ios, tns platform install ios
3) tns prepare ios, then follow the Xcode 12 settings for both the build and the development (emulator) targets.
Keep in mind to change VALID_ARCHS to x86_64 for the build, and
to arm64 for development, respectively.
For an Xcode 13 build, just change the VALID_ARCHS to:
10 Steps to install Laravel Sail and start developing web applications under WSL:
1. from Turn Windows features on and off:
choose Windows subsystem for Linux (WSL) -> and restart the system
2. update the kernel of WSL from https://wslstorestorage.blob.core.windows.net/wslblob/wsl_update_x64.msi
3. set the default version to 2: wsl --set-default-version 2
4. install from Microsoft Store: Ubuntu
open Command prompt, and type ubuntu
5. Update the ubuntu system:
sudo apt update && sudo apt dist-upgrade -y
6. Setup Docker: Install Docker Desktop
Go to Settings(icon) then check: General->Use the WSL2 based engine, as well as
Resources->WSL INTEGRATION-> enable integration with my default WSL distro, check also Ubuntu and restart the Docker Desktop app.
7. run inside Ubuntu: curl -s https://laravel.build/example-app | bash
8. start the containers with: ./vendor/bin/sail up
9. you can browse: 127.0.0.1:80
10. in another terminal of Ubuntu run: code .
so that you can edit your files inside Visual Studio Code.
Cheers!
Steps:
with lsusb we can first see if the device is recognised correctly.
then type: iwconfig then use the Tab key to get to your device name
then edit /etc/wpa_supplicant/wpa.conf
and place there:
network={
ssid="network_id"
psk="encoded_password"
}
(you need to supply your own network_id and encoded_password;
you can get the encoded_password by typing:
sudo wpa_passphrase your_ssid
then typing a password,
and you'll get a sample config file with the encoded password, which you can use to overwrite the original file.)
Next, start the wpa supplicant with:
sudo wpa_supplicant -D nl80211 -i wlx... (your wifi interface id) -c /etc/wpa_supplicant/wpa.conf
(nl80211 is the standard driver on modern kernels)
Enjoy!
In order to connect Laravel with RabbitMQ we will need the following library:
composer require vladimir-yuldashev/laravel-queue-rabbitmq
Then, inside config/queue.php, add the following under 'connections':
'connections' => [
// ...
'rabbitmq' => [
'driver' => 'rabbitmq',
'queue' => env('RABBITMQ_QUEUE', 'default'),
'connection' => PhpAmqpLib\Connection\AMQPLazyConnection::class,
'hosts' => [
[
'host' => env('RABBITMQ_HOST', '127.0.0.1'),
'port' => env('RABBITMQ_PORT', 5672),
'user' => env('RABBITMQ_USER', 'guest'),
'password' => env('RABBITMQ_PASSWORD', 'guest'),
'vhost' => env('RABBITMQ_VHOST', '/'),
],
],
'options' => [
'ssl_options' => [
'cafile' => env('RABBITMQ_SSL_CAFILE', null),
'local_cert' => env('RABBITMQ_SSL_LOCALCERT', null),
'local_key' => env('RABBITMQ_SSL_LOCALKEY', null),
'verify_peer' => env('RABBITMQ_SSL_VERIFY_PEER', true),
'passphrase' => env('RABBITMQ_SSL_PASSPHRASE', null),
],
'queue' => [
'job' => VladimirYuldashev\LaravelQueueRabbitMQ\Queue\Jobs\RabbitMQJob::class,
],
],
/*
* Set to "horizon" if you wish to use Laravel Horizon.
*/
'worker' => env('RABBITMQ_WORKER', 'default'),
],
// ...
],
Then you need to edit the .env file, supplying your settings under the RabbitMQ section: RABBITMQ_HOST, RABBITMQ_PORT, RABBITMQ_USER, RABBITMQ_PASSWORD, RABBITMQ_VHOST.
Also, for QUEUE_CONNECTION you should supply: rabbitmq
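The .env additions can be sketched as one block; the values mirror the defaults from the config above (written to /tmp here so as not to touch a real project):

```shell
cat > /tmp/env.rabbitmq.example <<'EOF'
QUEUE_CONNECTION=rabbitmq
RABBITMQ_HOST=127.0.0.1
RABBITMQ_PORT=5672
RABBITMQ_USER=guest
RABBITMQ_PASSWORD=guest
RABBITMQ_VHOST=/
EOF
grep -c '^RABBITMQ_' /tmp/env.rabbitmq.example
# prints: 5
```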
Now let's create a job in the terminal with:
php artisan make:job TestJob
It will handle all the incoming queue events. Its contents, under app/Jobs/TestJob.php:
private $data;
/**
* Create a new job instance.
*
* @return void
*/
public function __construct($data)
{
//
$this->data = $data;
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
print_r($this->data);
}
Finally, we connect and run the job handler created above in order to handle the events. Inside EventServiceProvider.php:
$this->app->bind(
TestJob::class."@handle",
fn($job) => $job->handle() // this will run the handle() function from above
);
Then, inside of a controller, you can run:
use App\Jobs\TestJob;
TestJob::dispatch('hello');
You can consume the queue with: php artisan queue:work
Here is how to install Angular Material on Ubuntu:
1. Install NODEJS/NPM
inside of a terminal type: sudo apt install nodejs
as an alternative you can use nvm:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
then just type: nvm install --lts
this will download, install, and use the latest long-term supported version of Node.
2. Install the angular CLI
with npm i -g @angular/cli
3. Create new project: ng new myproject
4. Add Material Design: ng add @angular/material
5. Restart ng serve if running and enjoy your Material enabled project!
Sometimes a run of apt update && apt dist-upgrade gets interrupted.
Here is a one-liner that finds the unfinished (half-configured) packages so they can be reinstalled. It creates a list of package names which can be passed to apt install:
grep "08:18:.* half-configured" /var/log/dpkg.log.1 /var/log/dpkg.log | awk '{printf "%s ", $5}'
first part of the command will grab only half-configured packages, while the second part will grab just the package name.
Here is the command in full:
sudo apt install --reinstall $(grep "08:18:.* half-configured" /var/log/dpkg.log.1 /var/log/dpkg.log | awk '{printf "%s ", $5}')
Replace 08:18 with the time at which you know the packages were interrupted from installing.
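Here is the pipeline exercised on a fabricated dpkg.log excerpt, so you can see what each part contributes (sample data; the real logs live in /var/log/dpkg.log*):

```shell
cat > /tmp/dpkg.log.sample <<'EOF'
2023-01-10 08:18:01 status half-configured libfoo:amd64 1.0-1
2023-01-10 08:18:02 status installed libbar:amd64 2.0-1
2023-01-10 08:18:03 status half-configured libbaz:amd64 3.0-1
EOF
# grep keeps only the half-configured lines; awk prints field 5, the package name
grep "08:18:.* half-configured" /tmp/dpkg.log.sample | awk '{printf "%s ", $5}'; echo
# prints: libfoo:amd64 libbaz:amd64
```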
Best luck!
Here are a few tips on how to customize your WordPress site without resorting to plugins: just insert the following PHP code inside your theme's functions.php file. I will be adding more.
Redirect inner page to outer domain:
add_action('template_redirect','redirect_from_to');
function redirect_from_to(){
if (is_page('mypage')){
wp_redirect('http://www.google.com', 301);
exit;
}
}
Note: mypage must be created in order for the redirect to work.
Allow svg files to be uploaded:
function cc_mime_types($mimes){
$mimes['svg'] = 'image/svg+xml';
return $mimes;
}
add_filter('upload_mimes','cc_mime_types');
Cheers!
We will setup debugging using xdebug with PHP inside of visual studio code.
Quick setup:
1) install php-xdebug:
sudo apt install php-xdebug
2) inside of php.ini at the end of the file set:
[xdebug]
xdebug.start_with_request = yes
xdebug.mode = debug
xdebug.discover_client_host = false
3) install the PHP Debug extension in VS Code and set the extension's port to 9003.
Now you can press F5 and start debugging.
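If F5 does not pick things up automatically, the port from step 3 can be set in a launch configuration; here is a minimal .vscode/launch.json sketch with typical defaults for the PHP Debug extension (written via the shell to a throwaway directory):

```shell
d=$(mktemp -d) && mkdir -p "$d/.vscode"
cat > "$d/.vscode/launch.json" <<'EOF'
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Listen for Xdebug",
      "type": "php",
      "request": "launch",
      "port": 9003
    }
  ]
}
EOF
grep '"port"' "$d/.vscode/launch.json"   # shows the 9003 port line
```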
Alternatively you can install xdebug using pecl.
The setup is valid for Ubuntu both on bare-metal as well as under Windows 10 with WSL.
Enjoy !
Refresh tokens are helpful because they have a longer expiry time than the access (secure) tokens, and they can be sent back to the server to reissue normal access tokens.
The primary aim of a refresh token is to renew the user's authentication in such a way that the user doesn't need to manually re-login into the system.
The flow of using refresh tokens together with access tokens is the following: initially we make a request containing a valid user/password combination to the server. After performing its checks, the server generates and returns a pair of access and refresh tokens. It sends the refresh token as an http-only cookie, which cannot be read or modified by JavaScript in the browser. Later, when the access token is about to expire, we use the cookie containing the refresh token to make a request to the server. The server checks the token's validity against its database and sends back to the client a new pair of refresh and access tokens.
In summary, we use a refresh token when our access token has expired and we would like to renew it, as well as to renew the refresh token itself; that is why it has a longer expiration time than the access token. Keep in mind that when the refresh token expires, the user has to re-login manually. For the technical implementation it is very good if you manage to place the refresh token inside an http-only cookie, because client-side JavaScript cannot be exploited to read or modify this type of cookie. Even if attackers replay a refresh request to the server, they cannot get hold of the newly issued access token. If you would like to increase the security of the generated tokens, you can also include browser and OS fingerprinting inside the token payload.
It is good if the authentication server can perform the following specific actions: generate access and refresh tokens, and revoke tokens (delete the refresh token). When a refresh token is generated, it usually goes through the following process: check whether the user id exists in the internal database with a token, check the validity of the token, and check how many tokens this user already has, because otherwise a single user could overflow our database with tokens, which is also a type of attack. When everything checks out, we save the newly generated token into our database.
The access token is used when performing service requests.
The secret key is stored on the server and is used to sign the JWT:
const Token = jwt.sign(
{ user: myUser }, // payload
jwtSecretKey,
{ expiresIn: '30d' }
);
On the client side, the access token usually resides in local storage.
1) Client side authentication - POST request to get the token:
payload: {
username: req.body.user,
password: req.body.password
}
Response:
Bearer: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwMSIsIm5hbWUiOiJKb2huIERvZSIsImlhdC
2) Client side: request + Authorization header
fetch(url, {
method: 'GET',
withCredentials: true,
credentials: 'include',
headers: {
'Authorization': bearer,
}
})
3) Server side authorization - the service verifies the token before serving the request:
// const token = req.get('Authorization');
// token = token.slice(7, token.length);
app.route('/secret_url').post(
jwtVerifier,
(req, res) => res.send('info')); // secret information
The refresh token is used when the access token has expired, in order to produce a new pair of access and refresh tokens.
The practical implementation of both JWT secure and refresh tokens can be seen in these 2 courses:
Angular with PHP and JSON web tokens (JWT)
JavaScript User Authentication Login Script (JWT)
Congratulations !
Here is how to create a simple application with the React front-end framework:
Setup the project:
sudo npm i -g create-react-app // install the dependencies
npx create-react-app my-react-app // create the initial application
npm start //start the live development server
App.js
import React from 'react';
import './App.css';
import Login from './loginComponent'; //default import
import {UsersList} from './usersList'; // specific import
const users = [ // using nested/presentational components
{ name: 'John', occupation: 'student', age: 23 },
{ name: 'Pete', occupation: 'teacher', age: 30 },
{ name: 'Anna', occupation: 'programmer', age: 35 }
];
const App = () => {
// double curly braces because of passing object and because of passing property
return (
<div className="App">
<Login user={{ name: "John", uid: 1000 }} />
<header className="App-header">
<UsersList users = {users} />
</header>
</div>
);
}
export default App;
usersList.js
import React from 'react';
import {UsersListItem} from './usersListItem'; // specific import
export const UsersList = ({ users }) => // destructure the users array from the passed props
( // with () we return automatically instead of writing {}
<>
{ users.map(user => <UsersListItem user={user} key={user.name} /> ) }
</>
);
// React will be used when we return react fragment
usersListItem.js
import React from 'react';
const sayMyName = (name) => {
alert(name);
}
export const UsersListItem = ({ user }) =>
(<div>
{user.name} -
{user.occupation} -
{user.age}
<button onClick={ ()=>sayMyName(user.name) }>Display name</button>
</div>
)
loginComponent.js
import React from 'react';
// show the import, conditionals and DOM element, also the export
// how to display props, destructurise in the passing parameters(props)
const Login = ({ user }) => {
let isAdmin = (user.uid) === 1000;
let logged_in = true;
// double conditional
return logged_in ? (
<>
hello mr.{user.name}
{ isAdmin ? `you are admin (conditional)` : null}
</>
)
: (<>please login</>)
}
export default Login;
Here is how to deploy your Angular project very quickly:
For Firebase you'll need to install the following schematics:
ng add @angular/fire
then just do:
ng deploy
You'll probably be asked to authenticate in the browser, and then your project will be on the Internet.
If you would like to use serverless functions for your NodeJS code, here is the way:
sudo npm install -g firebase-tools
firebase init functions
This will install and initialize the functions. Then go to the newly created /functions directory and install your packages such as: npm install nodemailer cors etc.
And now is time to edit the auto-generated index.js file.
When you are happy with the generated function you can deploy it, just run from the same directory:
firebase deploy
For Vercel, after the registration just link your github repository to Vercel. You can see/edit your current local git configuration with:
git config --local -e
To link the remote origin of your repository to the local git repo use:
git remote add origin https://github.com/your_username/project.git
if there is something on the remote side, you can overwrite it with the local version using:
git push --set-upstream origin master -f
or just pull and merge the remote version: git pull origin master
Then just do your commits, and when pushing you'll have a new version synchronized on the Internet.
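The git linking steps can be tried safely in a throwaway directory first (the GitHub URL is the same placeholder as above):

```shell
cd "$(mktemp -d)"               # scratch repo, nothing real is touched
git init -q .
git remote add origin https://github.com/your_username/project.git
git remote -v                   # lists origin with its fetch and push URLs
```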
Congratulations and enjoy the: Angular for beginners - modern TypeScript and RxJS course!
Here is how to do web development using the very fast Ubuntu native LXC/LXD containers. Part of the Practical Ubuntu Linux Server for beginners course.
Then initialize the basic environment:
sudo lxd init
We will also fetch image from a repository: linuxcontainers.org
and will start a container based on it:
sudo lxc launch images:alpine/3.10/amd64 webdevelopment
Let's see what we have in the system:
sudo lxc ls
sudo lxc storage ls
sudo lxc network ls
Now it is time to access the container with: sudo lxc exec webdevelopment sh
and then we will use apk to install openssh
apk add openssh-server
Let's also add an unprivileged user in order to access ssh:
adduser webdev
we will also start the server:
service sshd start
OK, let's check the address of the container with: ip a
Now we exit the shell (sh) and can connect to the container using our new user: ssh webdev@<container ip>
Alright, now go back inside the container, and we will add the Apache service:
apk add apache2
service apache2 restart
Optional:
If we need to get rid of the container, we need to stop it first:
sudo lxc stop webdevelopment
sudo lxc delete webdevelopment
If we need to get rid of the created storage pool, we run the following:
printf 'config: {}\ndevices: {}' | lxc profile edit default
lxc storage delete default
If we need to remove the created network bridge we can run:
sudo lxc network delete lxdbr0
Congratulations and happy learning !
Here is how to install configure and use microk8s with skaffold, step by step. Based on the Kubernetes course:
installation:
curl -Lo skaffold https://storage.googleapis.com/skaffold/releases/latest/skaffold-linux-amd64 && sudo install skaffold /usr/local/bin/
create the initial project skaffold configuration:
skaffold init
create alias to kubectl for skaffold to be able to use it :
sudo snap alias microk8s.kubectl kubectl
provide microk8s config to skaffold:
microk8s.kubectl config view --raw > $HOME/.kube/config
update the pod configuration to use the image from microk8s:
image: localhost:32000/php-app
(add localhost:32000...)
enable microk8s registry addon:
microk8s.enable registry
then test that the registry works by opening: http://localhost:32000/v2/
Start the development loop with:
skaffold dev --default-repo=localhost:32000
Check if the pod is running:
kubectl get pods
Expose the pod ports to be browsable:
kubectl port-forward pod/skaffold-pod 8080:4000
Optional: In case we need to debug inside the container:
docker run -ti localhost:32000/php-app:latest /bin/bash
Congratulations and enjoy the course !
Here is how to install phpmyadmin on Ubuntu 20.04
References: Practical Ubuntu Linux Server for beginners
We first need to have mysql-server installed, where phpmyadmin will store its data. For this reason we will run:
sudo apt install mysql-server
Then some libraries for the functioning of phpmyadmin as well as the phpmyadmin package:
sudo apt install phpmyadmin php-mbstring php-zip php-gd php-json php-curl php libapache2-mod-php
Note: if there is a problem in the installation you can Ignore, or Abort the configuration of phpmyadmin.
Let's now go and login inside of MySQL as root:
sudo mysql -u root
or, if you already have a user/password, login with: sudo mysql -u user -p
Next we will adjust the MySQL root password, as well as its method of authentication:
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password';
Optional:
Configure Apache in order to serve phpmyadmin (if not already done by the installation of phpmyadmin): inside /etc/apache2/conf-available/ we create the following phpmyadmin.conf file:
Alias /phpmyadmin /usr/share/phpmyadmin
<Directory /usr/share/phpmyadmin/>
AddDefaultCharset UTF-8
<IfModule mod_authz_core.c>
<RequireAny>
Require all granted
</RequireAny>
</IfModule>
</Directory>
<Directory /usr/share/phpmyadmin/setup/>
<IfModule mod_authz_core.c>
<RequireAny>
Require all granted
</RequireAny>
</IfModule>
</Directory>
Lastly, we need to activate the above configuration file with:
sudo a2enconf phpmyadmin
and then restart the apache2 service to reload and accept the changed configuration:
sudo systemctl restart apache2.service
Now is time to open up in the browser to: http://127.0.0.1/phpmyadmin
and use the combination that we already set: root / password
Congratulations and enjoy learning !
Here is how to create a real-life NodeJS API together with a login form.
Resources:
JavaScript for beginners - learn by doing
Learn Node.js, Express and MongoDB + JWT
We will start with the HTML representing the form as well as its JavaScript functionality:
formLogin.html
<html>
<body>
<form id="myform">
<div>
<label for="email">Email:</label>
<input type="text" id="email" name="email" />
</div>
<div>
<label for="password">Password:</label>
<input type="password" id="password" name="password" />
</div>
<div class="button">
<button type="submit" id="loginUser">Send</button>
</div>
</form>
<div id="result"></div>
<script type="text/javascript">
async function fetchData(url = '', data = {}, method, headers = {}) {
const response = await fetch(
url, {
method,
headers: { 'Content-Type': 'application/json', ...headers },
...data && { body: JSON.stringify(data) },
});
return response.json();
}
let form = document.querySelector('#myform');
if (form) {
form.addEventListener('submit', (e) => {
e.preventDefault();
fetchData(
'/user/login',
{ email: form.email.value, password: form.password.value },
'POST'
).then((result) => {
if (result.token) {
// request the url with token
fetchData('/info', null, 'GET', { Bearer: result.token })
.then((result) => { console.log(result); });
return;
}
document.querySelector('#result').innerHTML = `message: ${result.message}`;
})
.catch(error => console.log('error:', error));
})
}
</script>
</body>
</html>
The server entry point (for example, server.js):
import express from "express";
import mongoose from "mongoose";
import dotenv from "dotenv";
// import the routes
import routes from "./routes/routes.js";
// create an express instance
const app = express();
app.use(express.json())
// setup the middleware routes
routes(app);
// config the database credentials
dotenv.config();
// connect to the database
mongoose.connect(
process.env.DB_CONNECT,
{ useNewUrlParser: true, useUnifiedTopology: true },
() => console.log("connected to mongoDB")
);
// listen for errors
mongoose.connection.on('error', console.error.bind(console, 'MongoDB connection error:'));
// listen on port 3000
app.listen(3000, () => console.log("server is running"));
application routes: routes.js
import { loginUser } from "../controllers/controller.js";
import { info } from "../controllers/info.js"; // the protected route
import { auth } from "../controllers/verifyToken.js"; // middleware for validating the token
import * as path from 'path';
import { fileURLToPath } from 'url';
const __filename = fileURLToPath(import.meta.url); // The absolute URL of the current file.
const __dirname = path.dirname(__filename); // parse just the directory
const routes = app => {
app.route("/user/login").get((req, res) => { res.sendFile('formLogin.html', { root: path.join(__dirname, "../views") }); });
app.route("/user/login").post((req, res) => loginUser(req, res)); // we capture inside req, and res
app.route("/info").get(auth, (req, res) => info(req, res)); // we capture inside req, and res
// and insert the auth middleware to process the token
};
export default routes;
our main controller: controller.js
import mongoose from "mongoose";
mongoose.set("useCreateIndex", true);
import { userSchema } from "../models/user.js";
import jwt from "jsonwebtoken";
const User = mongoose.model("users", userSchema); // users is the name of our collection!
export const addNewUser = (req, res) => {
User.init(() => {
// init() resolves when the indexes have finished building successfully.
// in order for unique check to work
let newUser = new User(req.body); // just creating w/o saving
newUser.password = newUser.encryptPassword(req.body.password);
newUser.save((err, user) => { // now saving
if (err) {
res.json({ 'message': 'duplicate email' });
}
res.json(user);
});
});
};
export const loginUser = (req, res) => {
if (req.body.password == null || req.body.email == null) {
res.status(400).json({ 'message': 'Please provide email / password' });
return;
}
User.init(() => {
User.findOne({ email: req.body.email }, (err, user) => {
if (err) {
res.json(err);
return;
}
if (user == null) {
res.status(400).json({ 'message': 'Non existing user' });
return;
}
// here user is the fetched user
const validPassword = user.validatePassword(req.body.password, user.password);
if (!validPassword) {
res.status(400).json({ 'message': 'Not valid password' });
return;
}
// create and send a token to be able to use it in further requests
const token = jwt.sign({ _id: user._id }, process.env.TOKEN_SECRET);
res.header("auth-token", token) // set the token in the header of the response
.json({ 'token': token }); // display the token
});
});
};
js helper middleware for working with JWT tokens: verifyToken.js
import jwt from "jsonwebtoken";
export const auth = (req, res, next) => {
const token = req.header("Bearer");
if (!token) return res.status(401).json({'message':'access denied'});
try {
jwt.verify(token, process.env.TOKEN_SECRET); // throws if the token is invalid or expired
} catch (err) {
return res.status(400).json({'message':'Invalid token'});
}
// continue from the middleware to the next processing middleware :)
next();
};
user database model: user.js
import mongoose from 'mongoose';
import bcrypt from 'bcryptjs';
let userSchema = new mongoose.Schema(
{
email: {
type: String,
required: "Enter email",
maxlength: 50,
unique: true
},
password: {
type: String,
required: "Enter password",
maxlength: 65
}
},
{
timestamps: true
}
);
userSchema.method({
encryptPassword: (password) => {
return bcrypt.hashSync(password, bcrypt.genSaltSync(5));
},
validatePassword: (pass1, pass2) => {
return bcrypt.compareSync(pass1, pass2);
}
});
export { userSchema };
Congratulations !
References: Docker for web developers course.