chore(docs): update documentation and assets for RA.Aid project using Docusaurus

- Add new versioned documentation for RA.Aid project.
- Include installation instructions, quick starts, and markdown features.
- Add configuration files for Docusaurus setup.
- Introduce new images and logos for branding.
- Create a sidebar for better navigation in documentation.
- Implement a .gitignore file for the docs directory to exclude unnecessary files.

feat(docs): add SVG illustrations for Docusaurus documentation to enhance visual appeal
feat(docs): create tsconfig.json for improved TypeScript support in Docusaurus
fix(pyproject.toml): update dependencies to latest versions for better compatibility and features
fix(__main__.py): improve expert provider selection logic based on available API keys
feat(llm.py): implement function to fetch available OpenAI models and select expert model
fix(file_listing.py): enhance file listing functionality to include hidden files option and improve error handling
fix(deepseek_chat.py): add timeout and max_retries parameters to ChatDeepseekReasoner initialization
fix(version.py): bump version to 0.14.1 for release readiness

feat(models_params.py): add default_temperature to model parameters for consistency and configurability
refactor(interactive.py): enhance run_interactive_command to support expected runtime and improve output capture
fix(prompts.py): update instructions to clarify file modification methods
refactor(provider_strategy.py): streamline expert model selection logic for clarity and maintainability
chore(tool_configs.py): update tool imports to reflect changes in write_file functionality
refactor(agent.py): enhance LLM initialization to include temperature and improve error handling
feat(memory.py): normalize file paths in emit_related_files to prevent duplicates
feat(programmer.py): add get_aider_executable function to retrieve the aider executable path
test: add comprehensive tests for new features and refactor existing tests for clarity and coverage
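The related-file normalization mentioned above (memory.py) can be sketched roughly as follows. This is an illustrative assumption about the described behavior, not the actual RA.Aid `emit_related_files` code:

```python
import os

def normalize_related_files(paths):
    """Normalize paths and drop duplicates while preserving order.

    Hypothetical sketch of the dedupe behavior described in the commit
    message; the real emit_related_files implementation may differ.
    """
    seen = set()
    result = []
    for p in paths:
        norm = os.path.normpath(p)  # collapses "./a" and "a" to one form
        if norm not in seen:
            seen.add(norm)
            result.append(norm)
    return result
```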
Ariel Frischer 2025-02-14 13:11:10 -08:00
commit 6970a885e4
65 changed files with 21220 additions and 535 deletions

@@ -5,6 +5,47 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.14.1] - 2025-02-13
### Added
- Added expected_runtime_seconds parameter for shell commands with graceful process shutdown
- Added config printing at startup (#88)
### Changed
- Enforce byte limit in interactive commands
- Normalize/dedupe related file paths
- Relax aider version requirement for SWE-bench compatibility
- Upgrade langchain/langgraph dependencies
### Fixed
- Fixed aider flags (#89)
- Fixed write_file_tool references
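The graceful-shutdown behavior behind `expected_runtime_seconds` can be sketched as below. This is a simplified illustration of the idea (terminate, then kill); function and parameter names are assumptions, not RA.Aid's actual implementation:

```python
import subprocess

def run_with_runtime_limit(cmd, expected_runtime_seconds, grace_period=5):
    """Run cmd, allowing up to expected_runtime_seconds before a
    graceful shutdown: SIGTERM first, SIGKILL if the process lingers."""
    proc = subprocess.Popen(cmd)
    try:
        return proc.wait(timeout=expected_runtime_seconds)
    except subprocess.TimeoutExpired:
        proc.terminate()  # graceful: SIGTERM on POSIX
        try:
            return proc.wait(timeout=grace_period)
        except subprocess.TimeoutExpired:
            proc.kill()   # forceful: SIGKILL as a last resort
            return proc.wait()
```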
## [0.14.0] - 2025-02-12
### Added
- Status panel showing tool/LLM status and outputs
- Automatic detection of OpenAI expert models
- Timeouts on LLM clients
### Changed
- Improved interactive TTY process capture and history handling
- Upgraded langgraph dependencies
- Improved prompts and work logging
- Refined token/bytes ratio handling
- Support default temperature on per-model basis
- Reduced tool count for more reliable tool calling
- Updated logo and branding assets
- Set environment variables to disable common interactive modes
### Fixed
- Various test fixes
- Bug fixes for completion message handling and file content operations
- Interactive command input improvements
- Use reasoning_effort=high for OpenAI expert models
- Do not default to o1 model (#82)
- Make current working directory and date available to more agents
## [0.13.2] - 2025-02-02
- Fix temperature parameter error for expert tool.

@@ -1,4 +1,4 @@
-<img src="assets/logo.png" alt="RA.Aid - Develop software autonomously." style="margin-bottom: 20px;">
+<img src="assets/RA-black-bg.png" alt="RA.Aid - Develop software autonomously." style="margin-bottom: 20px;">
[![Python Versions](https://img.shields.io/badge/python-3.8%2B-blue)](https://www.python.org)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue)](LICENSE)

BIN
assets/RA-black-bg.png Normal file

Binary image (27 KiB)

BIN
assets/RA-black-square.png Normal file

Binary image (18 KiB)


20
docs/.gitignore vendored Normal file
@@ -0,0 +1,20 @@
# Dependencies
/node_modules
# Production
/build
# Generated files
.docusaurus
.cache-loader
# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local
npm-debug.log*
yarn-debug.log*
yarn-error.log*

41
docs/README.md Normal file
@@ -0,0 +1,41 @@
# Website
This website is built using [Docusaurus](https://docusaurus.io/), a modern static website generator.
### Installation
```
$ yarn
```
### Local Development
```
$ yarn start
```
This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
### Build
```
$ yarn build
```
This command generates static content into the `build` directory and can be served using any static content hosting service.
### Deployment
Using SSH:
```
$ USE_SSH=true yarn deploy
```
Not using SSH:
```
$ GIT_USER=<Your GitHub username> yarn deploy
```
If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.

@@ -0,0 +1,7 @@
---
sidebar_position: 1
---

# Getting Started

Welcome to the documentation. This page will help you get started.

48
docs/docs/intro.md Normal file
@@ -0,0 +1,48 @@
---
sidebar_position: 1
slug: /
---
# Intro
Let's discover **Docusaurus in less than 5 minutes**.
## Getting Started
Get started by **creating a new site**.
Or **try Docusaurus immediately** with **[docusaurus.new](https://docusaurus.new)**.
### What you'll need
- [Node.js](https://nodejs.org/en/download/) version 18.0 or above:
- When installing Node.js, we recommend checking all checkboxes related to dependencies.
## Generate a new site
Generate a new Docusaurus site using the **classic template**.
The classic template will automatically be added to your project after you run the command:
```bash
npm init docusaurus@latest my-website classic
```
You can type this command into Command Prompt, PowerShell, Terminal, or any other integrated terminal of your code editor.
The command also installs all necessary dependencies you need to run Docusaurus.
## Start your site
Run the development server:
```bash
cd my-website
npm run start
```
The `cd` command changes the directory you're working with. In order to work with your newly created Docusaurus site, you'll need to navigate the terminal there.
The `npm run start` command builds your website locally and serves it through a development server, ready for you to view at http://localhost:3000/.
Open `docs/intro.md` (this page) and edit some lines: the site **reloads automatically** and displays your changes.

@@ -0,0 +1,8 @@
{
"label": "Quick Starts",
"position": 2,
"link": {
"type": "generated-index",
"description": "5 minutes to learn the most important Docusaurus concepts."
}
}

@@ -0,0 +1,23 @@
---
sidebar_position: 6
---
# Congratulations!
You have just learned the **basics of Docusaurus** and made some changes to the **initial template**.
Docusaurus has **much more to offer**!
Have **5 more minutes**? Take a look at **[versioning](../tutorial-extras/manage-docs-versions.md)** and **[i18n](../tutorial-extras/translate-your-site.md)**.
Anything **unclear** or **buggy** in this tutorial? [Please report it!](https://github.com/facebook/docusaurus/discussions/4610)
## What's next?
- Read the [official documentation](https://docusaurus.io/)
- Modify your site configuration with [`docusaurus.config.js`](https://docusaurus.io/docs/api/docusaurus-config)
- Add navbar and footer items with [`themeConfig`](https://docusaurus.io/docs/api/themes/configuration)
- Add a custom [Design and Layout](https://docusaurus.io/docs/styling-layout)
- Add a [search bar](https://docusaurus.io/docs/search)
- Find inspirations in the [Docusaurus showcase](https://docusaurus.io/showcase)
- Get involved in the [Docusaurus Community](https://docusaurus.io/community/support)

@@ -0,0 +1,34 @@
---
sidebar_position: 3
---
# Create a Blog Post
Docusaurus creates a **page for each blog post**, but also a **blog index page**, a **tag system**, an **RSS** feed...
## Create your first Post
Create a file at `blog/2021-02-28-greetings.md`:
```md title="blog/2021-02-28-greetings.md"
---
slug: greetings
title: Greetings!
authors:
- name: Joel Marcey
title: Co-creator of Docusaurus 1
url: https://github.com/JoelMarcey
image_url: https://github.com/JoelMarcey.png
- name: Sébastien Lorber
title: Docusaurus maintainer
url: https://sebastienlorber.com
image_url: https://github.com/slorber.png
tags: [greetings]
---
Congratulations, you have made your first post!
Feel free to play around and edit this post as much as you like.
```
A new blog post is now available at [http://localhost:3000/blog/greetings](http://localhost:3000/blog/greetings).

@@ -0,0 +1,57 @@
---
sidebar_position: 2
---
# Create a Document
Documents are **groups of pages** connected through:
- a **sidebar**
- **previous/next navigation**
- **versioning**
## Create your first Doc
Create a Markdown file at `docs/hello.md`:
```md title="docs/hello.md"
# Hello
This is my **first Docusaurus document**!
```
A new document is now available at [http://localhost:3000/docs/hello](http://localhost:3000/docs/hello).
## Configure the Sidebar
Docusaurus automatically **creates a sidebar** from the `docs` folder.
Add metadata to customize the sidebar label and position:
```md title="docs/hello.md" {1-4}
---
sidebar_label: 'Hi!'
sidebar_position: 3
---
# Hello
This is my **first Docusaurus document**!
```
It is also possible to create your sidebar explicitly in `sidebars.js`:
```js title="sidebars.js"
export default {
tutorialSidebar: [
'intro',
// highlight-next-line
'hello',
{
type: 'category',
label: 'Tutorial',
items: ['quickstarts/create-a-document'],
},
],
};
```

@@ -0,0 +1,31 @@
---
sidebar_position: 5
---
# Deploy your site
Docusaurus is a **static-site-generator** (also called **[Jamstack](https://jamstack.org/)**).
It builds your site as simple **static HTML, JavaScript and CSS files**.
## Build your site
Build your site **for production**:
```bash
npm run build
```
The static files are generated in the `build` folder.
## Deploy your site
Test your production build locally:
```bash
npm run serve
```
The `build` folder is now served at [http://localhost:3000/](http://localhost:3000/).
You can now deploy the `build` folder **almost anywhere** easily, **for free** or very small cost (read the **[Deployment Guide](https://docusaurus.io/docs/deployment)**).

@@ -0,0 +1,14 @@
# Installation
Create a new Python 3.12 virtual environment and install RA.Aid:
```bash
uv venv -p 3.12
source .venv/bin/activate # On Unix/macOS
# or
.venv\Scripts\activate # On Windows
uv pip install ra-aid
```
Once installed, see the [Recommended Configuration](recommended) to set up RA.Aid with the recommended settings.

@@ -0,0 +1,152 @@
---
sidebar_position: 4
---
# Markdown Features
Docusaurus supports **[Markdown](https://daringfireball.net/projects/markdown/syntax)** and a few **additional features**.
## Front Matter
Markdown documents have metadata at the top called [Front Matter](https://jekyllrb.com/docs/front-matter/):
```text title="my-doc.md"
// highlight-start
---
id: my-doc-id
title: My document title
description: My document description
slug: /my-custom-url
---
// highlight-end
## Markdown heading
Markdown text with [links](./hello.md)
```
## Links
Regular Markdown links are supported, using url paths or relative file paths.
```md
Let's see how to [Create a page](/recommended).
```
```md
Let's see how to [Create a page](./recommended.md).
```
**Result:** Let's see how to [Create a page](./recommended.md).
## Images
Regular Markdown images are supported.
You can use absolute paths to reference images in the static directory (`static/img/docusaurus.png`):
```md
![Docusaurus logo](/img/docusaurus.png)
```
![Docusaurus logo](/img/docusaurus.png)
You can reference images relative to the current file as well. This is particularly useful to colocate images close to the Markdown files using them:
```md
![Docusaurus logo](./img/docusaurus.png)
```
## Code Blocks
Markdown code blocks are supported with Syntax highlighting.
````md
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return <h1>Hello, Docusaurus!</h1>;
}
```
````
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return <h1>Hello, Docusaurus!</h1>;
}
```
## Admonitions
Docusaurus has a special syntax to create admonitions and callouts:
```md
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
```
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
## MDX and React Components
[MDX](https://mdxjs.com/) can make your documentation more **interactive** and allows using any **React components inside Markdown**:
```jsx
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`)
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !
```
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`);
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !

@@ -0,0 +1,52 @@
# Recommended Config
This configuration combines the strengths of multiple AI models to provide the best experience:
- Anthropic Sonnet excels at driving the agent's core reasoning and planning
- OpenAI's models provide robust debugging and logical analysis capabilities
- Tavily web search integration allows the agent to find relevant information online
:::info
RA.Aid must be installed before using these configurations. If you haven't installed it yet, please see the [Installation Guide](installation).
:::
## Getting API Keys
To use RA.Aid with the recommended configuration, you'll need to obtain API keys from the following services:
1. **OpenAI API Key**: Create an account at [OpenAI's platform](https://platform.openai.com) and generate an API key from your dashboard.
2. **Anthropic API Key**: Sign up at [Anthropic's Console](https://console.anthropic.com), then generate an API key from the API Keys section.
3. **Tavily API Key** (optional): Create an account at [Tavily](https://app.tavily.com/sign-in) and get your API key from the dashboard.
Please keep your API keys secure and never share them publicly. Each service has its own pricing and usage terms.
## Configuration
Configure your API keys:
```bash
# For OpenAI (required)
export OPENAI_API_KEY=your_api_key_here
# For Anthropic (required)
export ANTHROPIC_API_KEY=your_api_key_here
# For web search capability (optional)
export TAVILY_API_KEY=your_api_key_here
```
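Before launching, it can help to verify the required variables are actually set in the current shell. A minimal, hypothetical pre-flight check (not part of ra-aid itself; ra-aid performs its own provider selection based on available keys):

```python
import os

REQUIRED_KEYS = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY")

def missing_keys(env=None):
    """Return the names of required API keys that are unset or empty."""
    env = os.environ if env is None else env
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    absent = missing_keys()
    if absent:
        print("Missing API keys:", ", ".join(absent))
```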
## Basic Usage
Start RA.Aid in interactive chat mode:
```bash
ra-aid --chat
```
Or run with a single command:
```bash
ra-aid -m "Help me understand this code"
```

@@ -0,0 +1,7 @@
{
"label": "Tutorial - Extras",
"position": 3,
"link": {
"type": "generated-index"
}
}


@@ -0,0 +1,55 @@
---
sidebar_position: 1
---
# Manage Docs Versions
Docusaurus can manage multiple versions of your docs.
## Create a docs version
Release a version 1.0 of your project:
```bash
npm run docusaurus docs:version 1.0
```
The `docs` folder is copied into `versioned_docs/version-1.0` and `versions.json` is created.
Your docs now have 2 versions:
- `1.0` at `http://localhost:3000/docs/` for the version 1.0 docs
- `current` at `http://localhost:3000/docs/next/` for the **upcoming, unreleased docs**
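With Docusaurus defaults, the generated `versions.json` at the site root is a JSON array of the cut version names, newest first; for this example it would contain:

```json
["1.0"]
```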
## Add a Version Dropdown
To navigate seamlessly across versions, add a version dropdown.
Modify the `docusaurus.config.js` file:
```js title="docusaurus.config.js"
export default {
themeConfig: {
navbar: {
items: [
// highlight-start
{
type: 'docsVersionDropdown',
},
// highlight-end
],
},
},
};
```
The docs version dropdown appears in your navbar:
![Docs Version Dropdown](./img/docsVersionDropdown.png)
## Update an existing version
It is possible to edit versioned docs in their respective folder:
- `versioned_docs/version-1.0/hello.md` updates `http://localhost:3000/docs/hello`
- `docs/hello.md` updates `http://localhost:3000/docs/next/hello`

@@ -0,0 +1,88 @@
---
sidebar_position: 2
---
# Translate your site
Let's translate `docs/intro.md` to French.
## Configure i18n
Modify `docusaurus.config.js` to add support for the `fr` locale:
```js title="docusaurus.config.js"
export default {
i18n: {
defaultLocale: 'en',
locales: ['en', 'fr'],
},
};
```
## Translate a doc
Copy the `docs/intro.md` file to the `i18n/fr` folder:
```bash
mkdir -p i18n/fr/docusaurus-plugin-content-docs/current/
cp docs/intro.md i18n/fr/docusaurus-plugin-content-docs/current/intro.md
```
Translate `i18n/fr/docusaurus-plugin-content-docs/current/intro.md` in French.
## Start your localized site
Start your site on the French locale:
```bash
npm run start -- --locale fr
```
Your localized site is accessible at [http://localhost:3000/fr/](http://localhost:3000/fr/) and the `Getting Started` page is translated.
:::caution
In development, you can only use one locale at a time.
:::
## Add a Locale Dropdown
To navigate seamlessly across languages, add a locale dropdown.
Modify the `docusaurus.config.js` file:
```js title="docusaurus.config.js"
export default {
themeConfig: {
navbar: {
items: [
// highlight-start
{
type: 'localeDropdown',
},
// highlight-end
],
},
},
};
```
The locale dropdown now appears in your navbar:
![Locale Dropdown](./img/localeDropdown.png)
## Build your localized site
Build your site for a specific locale:
```bash
npm run build -- --locale fr
```
Or build your site to include all the locales at once:
```bash
npm run build
```

72
docs/docusaurus.config.ts Normal file
@@ -0,0 +1,72 @@
import {themes as prismThemes} from 'prism-react-renderer';
import type {Config} from '@docusaurus/types';
import type * as Preset from '@docusaurus/preset-classic';
const config: Config = {
title: 'RA-Aid Documentation',
favicon: 'img/favicon.ico',
url: 'https://ra-aid.0.dev',
baseUrl: '/',
onDuplicateRoutes: 'ignore',
onBrokenLinks: 'throw',
onBrokenMarkdownLinks: 'warn',
i18n: {
defaultLocale: 'en',
locales: ['en'],
},
presets: [
[
'classic',
{
docs: {
sidebarPath: './sidebars.ts',
routeBasePath: '/',
},
theme: {
customCss: './src/css/custom.css',
},
} satisfies Preset.Options,
],
],
themeConfig: {
navbar: {
logo: {
alt: 'Site Logo',
src: 'img/logo-black-transparent.png',
srcDark: 'img/logo-white-transparent.gif',
href: 'https://ra-aid.ai'
},
items: [
{
type: 'doc',
position: 'left',
docId: 'intro',
label: 'Docs',
},
{
href: 'https://github.com/smallcloudai/refact-aide',
label: 'GitHub',
position: 'right',
},
],
},
footer: {
style: 'dark',
copyright: `Copyright © ${new Date().getFullYear()} My Project, Inc. Built with Docusaurus.`,
},
prism: {
theme: prismThemes.github,
darkTheme: prismThemes.dracula,
},
colorMode: {
defaultMode: 'dark',
respectPrefersColorScheme: false,
},
} satisfies Preset.ThemeConfig,
};
export default config;

17934
docs/package-lock.json generated Normal file

File diff suppressed because it is too large Load Diff

47
docs/package.json Normal file
@@ -0,0 +1,47 @@
{
"name": "docs",
"version": "0.0.0",
"private": true,
"scripts": {
"docusaurus": "docusaurus",
"start": "docusaurus start",
"build": "docusaurus build",
"swizzle": "docusaurus swizzle",
"deploy": "docusaurus deploy",
"clear": "docusaurus clear",
"serve": "docusaurus serve",
"write-translations": "docusaurus write-translations",
"write-heading-ids": "docusaurus write-heading-ids",
"typecheck": "tsc"
},
"dependencies": {
"@docusaurus/core": "3.7.0",
"@docusaurus/preset-classic": "3.7.0",
"@mdx-js/react": "^3.0.0",
"clsx": "^2.0.0",
"prism-react-renderer": "^2.3.0",
"react": "^19.0.0",
"react-dom": "^19.0.0"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "3.7.0",
"@docusaurus/tsconfig": "3.7.0",
"@docusaurus/types": "3.7.0",
"typescript": "~5.6.2"
},
"browserslist": {
"production": [
">0.5%",
"not dead",
"not op_mini all"
],
"development": [
"last 3 chrome version",
"last 3 firefox version",
"last 5 safari version"
]
},
"engines": {
"node": ">=18.0"
}
}

33
docs/sidebars.ts Normal file
@@ -0,0 +1,33 @@
import type {SidebarsConfig} from '@docusaurus/plugin-content-docs';
// This runs in Node.js - Don't use client-side code here (browser APIs, JSX...)
/**
* Creating a sidebar enables you to:
- create an ordered group of docs
- render a sidebar for each doc of that group
- provide next/previous navigation
The sidebars can be generated from the filesystem, or explicitly defined here.
Create as many sidebars as you want.
*/
const sidebars: SidebarsConfig = {
// By default, Docusaurus generates a sidebar from the docs folder structure
tutorialSidebar: [{type: 'autogenerated', dirName: '.'}],
// But you can create a sidebar manually
/*
tutorialSidebar: [
'intro',
'hello',
{
type: 'category',
label: 'Tutorial',
items: ['quickstarts/create-a-document'],
},
],
*/
};
export default sidebars;

@@ -0,0 +1,71 @@
import type {ReactNode} from 'react';
import clsx from 'clsx';
import Heading from '@theme/Heading';
import styles from './styles.module.css';
type FeatureItem = {
title: string;
Svg: React.ComponentType<React.ComponentProps<'svg'>>;
description: ReactNode;
};
const FeatureList: FeatureItem[] = [
{
title: 'Easy to Use',
Svg: require('@site/static/img/undraw_docusaurus_mountain.svg').default,
description: (
<>
Docusaurus was designed from the ground up to be easily installed and
used to get your website up and running quickly.
</>
),
},
{
title: 'Focus on What Matters',
Svg: require('@site/static/img/undraw_docusaurus_tree.svg').default,
description: (
<>
Docusaurus lets you focus on your docs, and we&apos;ll do the chores. Go
ahead and move your docs into the <code>docs</code> directory.
</>
),
},
{
title: 'Powered by React',
Svg: require('@site/static/img/undraw_docusaurus_react.svg').default,
description: (
<>
Extend or customize your website layout by reusing React. Docusaurus can
be extended while reusing the same header and footer.
</>
),
},
];
function Feature({title, Svg, description}: FeatureItem) {
return (
<div className={clsx('col col--4')}>
<div className="text--center">
<Svg className={styles.featureSvg} role="img" />
</div>
<div className="text--center padding-horiz--md">
<Heading as="h3">{title}</Heading>
<p>{description}</p>
</div>
</div>
);
}
export default function HomepageFeatures(): ReactNode {
return (
<section className={styles.features}>
<div className="container">
<div className="row">
{FeatureList.map((props, idx) => (
<Feature key={idx} {...props} />
))}
</div>
</div>
</section>
);
}

@@ -0,0 +1,11 @@
.features {
display: flex;
align-items: center;
padding: 2rem 0;
width: 100%;
}
.featureSvg {
height: 200px;
width: 200px;
}

30
docs/src/css/custom.css Normal file
@@ -0,0 +1,30 @@
/**
* Any CSS included here will be global. The classic template
* bundles Infima by default. Infima is a CSS framework designed to
* work well for content-centric websites.
*/
/* You can override the default Infima variables here. */
:root {
--ifm-color-primary: #2e8555;
--ifm-color-primary-dark: #29784c;
--ifm-color-primary-darker: #277148;
--ifm-color-primary-darkest: #205d3b;
--ifm-color-primary-light: #33925d;
--ifm-color-primary-lighter: #359962;
--ifm-color-primary-lightest: #3cad6e;
--ifm-code-font-size: 95%;
--docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.1);
}
/* For readability concerns, you should choose a lighter palette in dark mode. */
[data-theme='dark'] {
--ifm-color-primary: #25c2a0;
--ifm-color-primary-dark: #21af90;
--ifm-color-primary-darker: #1fa588;
--ifm-color-primary-darkest: #1a8870;
--ifm-color-primary-light: #29d5b0;
--ifm-color-primary-lighter: #32d8b4;
--ifm-color-primary-lightest: #4fddbf;
--docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.3);
}

0
docs/static/.nojekyll vendored Normal file


BIN
docs/static/img/docusaurus.png vendored Normal file

Binary image (5.0 KiB)

BIN
docs/static/img/favicon.ico vendored Normal file

Binary image (3.5 KiB)


1
docs/static/img/logo.svg vendored Normal file

File diff suppressed because one or more lines are too long

SVG image (6.3 KiB)

@@ -0,0 +1,171 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1088" height="687.962" viewBox="0 0 1088 687.962">
<title>Easy to Use</title>
<g id="Group_12" data-name="Group 12" transform="translate(-57 -56)">
<g id="Group_11" data-name="Group 11" transform="translate(57 56)">
<path id="Path_83" data-name="Path 83" d="M1017.81,560.461c-5.27,45.15-16.22,81.4-31.25,110.31-20,38.52-54.21,54.04-84.77,70.28a193.275,193.275,0,0,1-27.46,11.94c-55.61,19.3-117.85,14.18-166.74,3.99a657.282,657.282,0,0,0-104.09-13.16q-14.97-.675-29.97-.67c-15.42.02-293.07,5.29-360.67-131.57-16.69-33.76-28.13-75-32.24-125.27-11.63-142.12,52.29-235.46,134.74-296.47,155.97-115.41,369.76-110.57,523.43,7.88C941.15,276.621,1036.99,396.031,1017.81,560.461Z" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_84" data-name="Path 84" d="M986.56,670.771c-20,38.52-47.21,64.04-77.77,80.28a193.272,193.272,0,0,1-27.46,11.94c-55.61,19.3-117.85,14.18-166.74,3.99a657.3,657.3,0,0,0-104.09-13.16q-14.97-.675-29.97-.67-23.13.03-46.25,1.72c-100.17,7.36-253.82-6.43-321.42-143.29L382,283.981,444.95,445.6l20.09,51.59,55.37-75.98L549,381.981l130.2,149.27,36.8-81.27L970.78,657.9l14.21,11.59Z" transform="translate(-56 -106.019)" fill="#f2f2f2"/>
<path id="Path_85" data-name="Path 85" d="M302,282.962l26-57,36,83-31-60Z" opacity="0.1"/>
<path id="Path_86" data-name="Path 86" d="M610.5,753.821q-14.97-.675-29.97-.67L465.04,497.191Z" transform="translate(-56 -106.019)" opacity="0.1"/>
<path id="Path_87" data-name="Path 87" d="M464.411,315.191,493,292.962l130,150-132-128Z" opacity="0.1"/>
<path id="Path_88" data-name="Path 88" d="M908.79,751.051a193.265,193.265,0,0,1-27.46,11.94L679.2,531.251Z" transform="translate(-56 -106.019)" opacity="0.1"/>
<circle id="Ellipse_11" data-name="Ellipse 11" cx="3" cy="3" r="3" transform="translate(479 98.962)" fill="#f2f2f2"/>
<circle id="Ellipse_12" data-name="Ellipse 12" cx="3" cy="3" r="3" transform="translate(396 201.962)" fill="#f2f2f2"/>
<circle id="Ellipse_13" data-name="Ellipse 13" cx="2" cy="2" r="2" transform="translate(600 220.962)" fill="#f2f2f2"/>
<circle id="Ellipse_14" data-name="Ellipse 14" cx="2" cy="2" r="2" transform="translate(180 265.962)" fill="#f2f2f2"/>
<circle id="Ellipse_15" data-name="Ellipse 15" cx="2" cy="2" r="2" transform="translate(612 96.962)" fill="#f2f2f2"/>
<circle id="Ellipse_16" data-name="Ellipse 16" cx="2" cy="2" r="2" transform="translate(736 192.962)" fill="#f2f2f2"/>
<circle id="Ellipse_17" data-name="Ellipse 17" cx="2" cy="2" r="2" transform="translate(858 344.962)" fill="#f2f2f2"/>
<path id="Path_89" data-name="Path 89" d="M306,121.222h-2.76v-2.76h-1.48v2.76H299V122.7h2.76v2.759h1.48V122.7H306Z" fill="#f2f2f2"/>
<path id="Path_90" data-name="Path 90" d="M848,424.222h-2.76v-2.76h-1.48v2.76H841V425.7h2.76v2.759h1.48V425.7H848Z" fill="#f2f2f2"/>
<path id="Path_91" data-name="Path 91" d="M1144,719.981c0,16.569-243.557,74-544,74s-544-57.431-544-74,243.557,14,544,14S1144,703.413,1144,719.981Z" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_92" data-name="Path 92" d="M1144,719.981c0,16.569-243.557,74-544,74s-544-57.431-544-74,243.557,14,544,14S1144,703.413,1144,719.981Z" transform="translate(-56 -106.019)" opacity="0.1"/>
<ellipse id="Ellipse_18" data-name="Ellipse 18" cx="544" cy="30" rx="544" ry="30" transform="translate(0 583.962)" fill="#3f3d56"/>
<path id="Path_93" data-name="Path 93" d="M624,677.981c0,33.137-14.775,24-33,24s-33,9.137-33-24,33-96,33-96S624,644.844,624,677.981Z" transform="translate(-56 -106.019)" fill="#ff6584"/>
<path id="Path_94" data-name="Path 94" d="M606,690.66c0,15.062-6.716,10.909-15,10.909s-15,4.153-15-10.909,15-43.636,15-43.636S606,675.6,606,690.66Z" transform="translate(-56 -106.019)" opacity="0.1"/>
<rect id="Rectangle_97" data-name="Rectangle 97" width="92" height="18" rx="9" transform="translate(489 604.962)" fill="#2f2e41"/>
<rect id="Rectangle_98" data-name="Rectangle 98" width="92" height="18" rx="9" transform="translate(489 586.962)" fill="#2f2e41"/>
<path id="Path_95" data-name="Path 95" d="M193,596.547c0,55.343,34.719,100.126,77.626,100.126" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_96" data-name="Path 96" d="M270.626,696.673c0-55.965,38.745-101.251,86.626-101.251" transform="translate(-56 -106.019)" fill="#6c63ff"/>
<path id="Path_97" data-name="Path 97" d="M221.125,601.564c0,52.57,22.14,95.109,49.5,95.109" transform="translate(-56 -106.019)" fill="#6c63ff"/>
<path id="Path_98" data-name="Path 98" d="M270.626,696.673c0-71.511,44.783-129.377,100.126-129.377" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_99" data-name="Path 99" d="M254.3,697.379s11.009-.339,14.326-2.7,16.934-5.183,17.757-1.395,16.544,18.844,4.115,18.945-28.879-1.936-32.19-3.953S254.3,697.379,254.3,697.379Z" transform="translate(-56 -106.019)" fill="#a8a8a8"/>
<path id="Path_100" data-name="Path 100" d="M290.716,710.909c-12.429.1-28.879-1.936-32.19-3.953-2.522-1.536-3.527-7.048-3.863-9.591l-.368.014s.7,8.879,4.009,10.9,19.761,4.053,32.19,3.953c3.588-.029,4.827-1.305,4.759-3.2C294.755,710.174,293.386,710.887,290.716,710.909Z" transform="translate(-56 -106.019)" opacity="0.2"/>
<path id="Path_101" data-name="Path 101" d="M777.429,633.081c0,38.029,23.857,68.8,53.341,68.8" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_102" data-name="Path 102" d="M830.769,701.882c0-38.456,26.623-69.575,59.525-69.575" transform="translate(-56 -106.019)" fill="#6c63ff"/>
<path id="Path_103" data-name="Path 103" d="M796.755,636.528c0,36.124,15.213,65.354,34.014,65.354" transform="translate(-56 -106.019)" fill="#6c63ff"/>
<path id="Path_104" data-name="Path 104" d="M830.769,701.882c0-49.139,30.773-88.9,68.8-88.9" transform="translate(-56 -106.019)" fill="#3f3d56"/>
<path id="Path_105" data-name="Path 105" d="M819.548,702.367s7.565-.233,9.844-1.856,11.636-3.562,12.2-.958,11.368,12.949,2.828,13.018-19.844-1.33-22.119-2.716S819.548,702.367,819.548,702.367Z" transform="translate(-56 -106.019)" fill="#a8a8a8"/>
<path id="Path_106" data-name="Path 106" d="M844.574,711.664c-8.54.069-19.844-1.33-22.119-2.716-1.733-1.056-2.423-4.843-2.654-6.59l-.253.01s.479,6.1,2.755,7.487,13.579,2.785,22.119,2.716c2.465-.02,3.317-.9,3.27-2.2C847.349,711.159,846.409,711.649,844.574,711.664Z" transform="translate(-56 -106.019)" opacity="0.2"/>
<path id="Path_107" data-name="Path 107" d="M949.813,724.718s11.36-1.729,14.5-4.591,16.89-7.488,18.217-3.667,19.494,17.447,6.633,19.107-30.153,1.609-33.835-.065S949.813,724.718,949.813,724.718Z" transform="translate(-56 -106.019)" fill="#a8a8a8"/>
<path id="Path_108" data-name="Path 108" d="M989.228,734.173c-12.86,1.659-30.153,1.609-33.835-.065-2.8-1.275-4.535-6.858-5.2-9.45l-.379.061s1.833,9.109,5.516,10.783,20.975,1.725,33.835.065c3.712-.479,4.836-1.956,4.529-3.906C993.319,732.907,991.991,733.817,989.228,734.173Z" transform="translate(-56 -106.019)" opacity="0.2"/>
<path id="Path_109" data-name="Path 109" d="M670.26,723.9s9.587-1.459,12.237-3.875,14.255-6.32,15.374-3.095,16.452,14.725,5.6,16.125-25.448,1.358-28.555-.055S670.26,723.9,670.26,723.9Z" transform="translate(-56 -106.019)" fill="#a8a8a8"/>
<path id="Path_110" data-name="Path 110" d="M703.524,731.875c-10.853,1.4-25.448,1.358-28.555-.055-2.367-1.076-3.827-5.788-4.39-7.976l-.32.051s1.547,7.687,4.655,9.1,17.7,1.456,28.555.055c3.133-.4,4.081-1.651,3.822-3.3C706.977,730.807,705.856,731.575,703.524,731.875Z" transform="translate(-56 -106.019)" opacity="0.2"/>
<path id="Path_111" data-name="Path 111" d="M178.389,719.109s7.463-1.136,9.527-3.016,11.1-4.92,11.969-2.409,12.808,11.463,4.358,12.553-19.811,1.057-22.23-.043S178.389,719.109,178.389,719.109Z" transform="translate(-56 -106.019)" fill="#a8a8a8"/>
<path id="Path_112" data-name="Path 112" d="M204.285,725.321c-8.449,1.09-19.811,1.057-22.23-.043-1.842-.838-2.979-4.506-3.417-6.209l-.249.04s1.2,5.984,3.624,7.085,13.781,1.133,22.23.043c2.439-.315,3.177-1.285,2.976-2.566C206.973,724.489,206.1,725.087,204.285,725.321Z" transform="translate(-56 -106.019)" opacity="0.2"/>
<path id="Path_113" data-name="Path 113" d="M439.7,707.337c0,30.22-42.124,20.873-93.7,20.873s-93.074,9.347-93.074-20.873,42.118-36.793,93.694-36.793S439.7,677.117,439.7,707.337Z" transform="translate(-56 -106.019)" opacity="0.1"/>
<path id="Path_114" data-name="Path 114" d="M439.7,699.9c0,30.22-42.124,20.873-93.7,20.873s-93.074,9.347-93.074-20.873S295.04,663.1,346.616,663.1,439.7,669.676,439.7,699.9Z" transform="translate(-56 -106.019)" fill="#3f3d56"/>
</g>
<g id="docusaurus_keytar" transform="translate(312.271 493.733)">
<path id="Path_40" data-name="Path 40" d="M99,52h91.791V89.153H99Z" transform="translate(5.904 -14.001)" fill="#fff" fill-rule="evenodd"/>
<path id="Path_41" data-name="Path 41" d="M24.855,163.927A21.828,21.828,0,0,1,5.947,153a21.829,21.829,0,0,0,18.908,32.782H46.71V163.927Z" transform="translate(-3 -4.634)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_42" data-name="Path 42" d="M121.861,61.1l76.514-4.782V45.39A21.854,21.854,0,0,0,176.52,23.535H78.173L75.441,18.8a3.154,3.154,0,0,0-5.464,0l-2.732,4.732L64.513,18.8a3.154,3.154,0,0,0-5.464,0l-2.732,4.732L53.586,18.8a3.154,3.154,0,0,0-5.464,0L45.39,23.535c-.024,0-.046,0-.071,0l-4.526-4.525a3.153,3.153,0,0,0-5.276,1.414l-1.5,5.577-5.674-1.521a3.154,3.154,0,0,0-3.863,3.864L26,34.023l-5.575,1.494a3.155,3.155,0,0,0-1.416,5.278l4.526,4.526c0,.023,0,.046,0,.07L18.8,48.122a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,59.05a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,69.977a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,80.9a3.154,3.154,0,0,0,0,5.464L23.535,89.1,18.8,91.832a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,102.76a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,113.687a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,124.615a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,135.542a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,146.469a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,157.4a3.154,3.154,0,0,0,0,5.464l4.732,2.732L18.8,168.324a3.154,3.154,0,0,0,0,5.464l4.732,2.732A21.854,21.854,0,0,0,45.39,198.375H176.52a21.854,21.854,0,0,0,21.855-21.855V89.1l-76.514-4.782a11.632,11.632,0,0,1,0-23.219" transform="translate(-1.681 -17.226)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_43" data-name="Path 43" d="M143,186.71h32.782V143H143Z" transform="translate(9.984 -5.561)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_44" data-name="Path 44" d="M196.71,159.855a5.438,5.438,0,0,0-.7.07c-.042-.164-.081-.329-.127-.493a5.457,5.457,0,1,0-5.4-9.372q-.181-.185-.366-.367a5.454,5.454,0,1,0-9.384-5.4c-.162-.046-.325-.084-.486-.126a5.467,5.467,0,1,0-10.788,0c-.162.042-.325.08-.486.126a5.457,5.457,0,1,0-9.384,5.4,21.843,21.843,0,1,0,36.421,21.02,5.452,5.452,0,1,0,.7-10.858" transform="translate(10.912 -6.025)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_45" data-name="Path 45" d="M153,124.855h32.782V103H153Z" transform="translate(10.912 -9.271)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_46" data-name="Path 46" d="M194.855,116.765a2.732,2.732,0,1,0,0-5.464,2.811,2.811,0,0,0-.349.035c-.022-.082-.04-.164-.063-.246a2.733,2.733,0,0,0-1.052-5.253,2.7,2.7,0,0,0-1.648.566q-.09-.093-.184-.184a2.7,2.7,0,0,0,.553-1.633,2.732,2.732,0,0,0-5.245-1.07,10.928,10.928,0,1,0,0,21.031,2.732,2.732,0,0,0,5.245-1.07,2.7,2.7,0,0,0-.553-1.633q.093-.09.184-.184a2.7,2.7,0,0,0,1.648.566,2.732,2.732,0,0,0,1.052-5.253c.023-.081.042-.164.063-.246a2.814,2.814,0,0,0,.349.035" transform="translate(12.767 -9.377)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_47" data-name="Path 47" d="M65.087,56.891a2.732,2.732,0,0,1-2.732-2.732,8.2,8.2,0,0,0-16.391,0,2.732,2.732,0,0,1-5.464,0,13.659,13.659,0,0,1,27.319,0,2.732,2.732,0,0,1-2.732,2.732" transform="translate(0.478 -15.068)" fill-rule="evenodd"/>
<path id="Path_48" data-name="Path 48" d="M103,191.347h65.565a21.854,21.854,0,0,0,21.855-21.855V93H124.855A21.854,21.854,0,0,0,103,114.855Z" transform="translate(6.275 -10.199)" fill="#ffff50" fill-rule="evenodd"/>
<path id="Path_49" data-name="Path 49" d="M173.216,129.787H118.535a1.093,1.093,0,1,1,0-2.185h54.681a1.093,1.093,0,0,1,0,2.185m0,21.855H118.535a1.093,1.093,0,1,1,0-2.186h54.681a1.093,1.093,0,0,1,0,2.186m0,21.855H118.535a1.093,1.093,0,1,1,0-2.185h54.681a1.093,1.093,0,0,1,0,2.185m0-54.434H118.535a1.093,1.093,0,1,1,0-2.185h54.681a1.093,1.093,0,0,1,0,2.185m0,21.652H118.535a1.093,1.093,0,1,1,0-2.186h54.681a1.093,1.093,0,0,1,0,2.186m0,21.855H118.535a1.093,1.093,0,1,1,0-2.186h54.681a1.093,1.093,0,0,1,0,2.186M189.585,61.611c-.013,0-.024-.007-.037-.005-3.377.115-4.974,3.492-6.384,6.472-1.471,3.114-2.608,5.139-4.473,5.078-2.064-.074-3.244-2.406-4.494-4.874-1.436-2.835-3.075-6.049-6.516-5.929-3.329.114-4.932,3.053-6.346,5.646-1.5,2.762-2.529,4.442-4.5,4.364-2.106-.076-3.225-1.972-4.52-4.167-1.444-2.443-3.112-5.191-6.487-5.1-3.272.113-4.879,2.606-6.3,4.808-1.5,2.328-2.552,3.746-4.551,3.662-2.156-.076-3.27-1.65-4.558-3.472-1.447-2.047-3.077-4.363-6.442-4.251-3.2.109-4.807,2.153-6.224,3.954-1.346,1.709-2.4,3.062-4.621,2.977a1.093,1.093,0,0,0-.079,2.186c3.3.11,4.967-1.967,6.417-3.81,1.286-1.635,2.4-3.045,4.582-3.12,2.1-.09,3.091,1.218,4.584,3.327,1.417,2,3.026,4.277,6.263,4.394,3.391.114,5.022-2.42,6.467-4.663,1.292-2,2.406-3.734,4.535-3.807,1.959-.073,3.026,1.475,4.529,4.022,1.417,2.4,3.023,5.121,6.324,5.241,3.415.118,5.064-2.863,6.5-5.5,1.245-2.282,2.419-4.437,4.5-4.509,1.959-.046,2.981,1.743,4.492,4.732,1.412,2.79,3.013,5.95,6.365,6.071l.185,0c3.348,0,4.937-3.36,6.343-6.331,1.245-2.634,2.423-5.114,4.444-5.216Z" transform="translate(7.109 -13.11)" fill-rule="evenodd"/>
<path id="Path_50" data-name="Path 50" d="M83,186.71h43.71V143H83Z" transform="translate(4.42 -5.561)" fill="#3ecc5f" fill-rule="evenodd"/>
<g id="Group_8" data-name="Group 8" transform="matrix(0.966, -0.259, 0.259, 0.966, 109.327, 91.085)">
<rect id="Rectangle_3" data-name="Rectangle 3" width="92.361" height="36.462" rx="2" transform="translate(0 0)" fill="#d8d8d8"/>
<g id="Group_2" data-name="Group 2" transform="translate(1.531 23.03)">
<rect id="Rectangle_4" data-name="Rectangle 4" width="5.336" height="5.336" rx="1" transform="translate(16.797 0)" fill="#4a4a4a"/>
<rect id="Rectangle_5" data-name="Rectangle 5" width="5.336" height="5.336" rx="1" transform="translate(23.12 0)" fill="#4a4a4a"/>
<rect id="Rectangle_6" data-name="Rectangle 6" width="5.336" height="5.336" rx="1" transform="translate(29.444 0)" fill="#4a4a4a"/>
<rect id="Rectangle_7" data-name="Rectangle 7" width="5.336" height="5.336" rx="1" transform="translate(35.768 0)" fill="#4a4a4a"/>
<rect id="Rectangle_8" data-name="Rectangle 8" width="5.336" height="5.336" rx="1" transform="translate(42.091 0)" fill="#4a4a4a"/>
<rect id="Rectangle_9" data-name="Rectangle 9" width="5.336" height="5.336" rx="1" transform="translate(48.415 0)" fill="#4a4a4a"/>
<rect id="Rectangle_10" data-name="Rectangle 10" width="5.336" height="5.336" rx="1" transform="translate(54.739 0)" fill="#4a4a4a"/>
<rect id="Rectangle_11" data-name="Rectangle 11" width="5.336" height="5.336" rx="1" transform="translate(61.063 0)" fill="#4a4a4a"/>
<rect id="Rectangle_12" data-name="Rectangle 12" width="5.336" height="5.336" rx="1" transform="translate(67.386 0)" fill="#4a4a4a"/>
<path id="Path_51" data-name="Path 51" d="M1.093,0H14.518a1.093,1.093,0,0,1,1.093,1.093V4.243a1.093,1.093,0,0,1-1.093,1.093H1.093A1.093,1.093,0,0,1,0,4.243V1.093A1.093,1.093,0,0,1,1.093,0ZM75,0H88.426a1.093,1.093,0,0,1,1.093,1.093V4.243a1.093,1.093,0,0,1-1.093,1.093H75a1.093,1.093,0,0,1-1.093-1.093V1.093A1.093,1.093,0,0,1,75,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
</g>
<g id="Group_3" data-name="Group 3" transform="translate(1.531 10.261)">
<path id="Path_52" data-name="Path 52" d="M1.093,0H6.218A1.093,1.093,0,0,1,7.31,1.093V4.242A1.093,1.093,0,0,1,6.218,5.335H1.093A1.093,1.093,0,0,1,0,4.242V1.093A1.093,1.093,0,0,1,1.093,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_13" data-name="Rectangle 13" width="5.336" height="5.336" rx="1" transform="translate(8.299 0)" fill="#4a4a4a"/>
<rect id="Rectangle_14" data-name="Rectangle 14" width="5.336" height="5.336" rx="1" transform="translate(14.623 0)" fill="#4a4a4a"/>
<rect id="Rectangle_15" data-name="Rectangle 15" width="5.336" height="5.336" rx="1" transform="translate(20.947 0)" fill="#4a4a4a"/>
<rect id="Rectangle_16" data-name="Rectangle 16" width="5.336" height="5.336" rx="1" transform="translate(27.271 0)" fill="#4a4a4a"/>
<rect id="Rectangle_17" data-name="Rectangle 17" width="5.336" height="5.336" rx="1" transform="translate(33.594 0)" fill="#4a4a4a"/>
<rect id="Rectangle_18" data-name="Rectangle 18" width="5.336" height="5.336" rx="1" transform="translate(39.918 0)" fill="#4a4a4a"/>
<rect id="Rectangle_19" data-name="Rectangle 19" width="5.336" height="5.336" rx="1" transform="translate(46.242 0)" fill="#4a4a4a"/>
<rect id="Rectangle_20" data-name="Rectangle 20" width="5.336" height="5.336" rx="1" transform="translate(52.565 0)" fill="#4a4a4a"/>
<rect id="Rectangle_21" data-name="Rectangle 21" width="5.336" height="5.336" rx="1" transform="translate(58.888 0)" fill="#4a4a4a"/>
<rect id="Rectangle_22" data-name="Rectangle 22" width="5.336" height="5.336" rx="1" transform="translate(65.212 0)" fill="#4a4a4a"/>
<rect id="Rectangle_23" data-name="Rectangle 23" width="5.336" height="5.336" rx="1" transform="translate(71.536 0)" fill="#4a4a4a"/>
<rect id="Rectangle_24" data-name="Rectangle 24" width="5.336" height="5.336" rx="1" transform="translate(77.859 0)" fill="#4a4a4a"/>
<rect id="Rectangle_25" data-name="Rectangle 25" width="5.336" height="5.336" rx="1" transform="translate(84.183 0)" fill="#4a4a4a"/>
</g>
<g id="Group_4" data-name="Group 4" transform="translate(91.05 9.546) rotate(180)">
<path id="Path_53" data-name="Path 53" d="M1.093,0H6.219A1.093,1.093,0,0,1,7.312,1.093v3.15A1.093,1.093,0,0,1,6.219,5.336H1.093A1.093,1.093,0,0,1,0,4.243V1.093A1.093,1.093,0,0,1,1.093,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_26" data-name="Rectangle 26" width="5.336" height="5.336" rx="1" transform="translate(8.299 0)" fill="#4a4a4a"/>
<rect id="Rectangle_27" data-name="Rectangle 27" width="5.336" height="5.336" rx="1" transform="translate(14.623 0)" fill="#4a4a4a"/>
<rect id="Rectangle_28" data-name="Rectangle 28" width="5.336" height="5.336" rx="1" transform="translate(20.947 0)" fill="#4a4a4a"/>
<rect id="Rectangle_29" data-name="Rectangle 29" width="5.336" height="5.336" rx="1" transform="translate(27.271 0)" fill="#4a4a4a"/>
<rect id="Rectangle_30" data-name="Rectangle 30" width="5.336" height="5.336" rx="1" transform="translate(33.594 0)" fill="#4a4a4a"/>
<rect id="Rectangle_31" data-name="Rectangle 31" width="5.336" height="5.336" rx="1" transform="translate(39.918 0)" fill="#4a4a4a"/>
<rect id="Rectangle_32" data-name="Rectangle 32" width="5.336" height="5.336" rx="1" transform="translate(46.242 0)" fill="#4a4a4a"/>
<rect id="Rectangle_33" data-name="Rectangle 33" width="5.336" height="5.336" rx="1" transform="translate(52.565 0)" fill="#4a4a4a"/>
<rect id="Rectangle_34" data-name="Rectangle 34" width="5.336" height="5.336" rx="1" transform="translate(58.889 0)" fill="#4a4a4a"/>
<rect id="Rectangle_35" data-name="Rectangle 35" width="5.336" height="5.336" rx="1" transform="translate(65.213 0)" fill="#4a4a4a"/>
<rect id="Rectangle_36" data-name="Rectangle 36" width="5.336" height="5.336" rx="1" transform="translate(71.537 0)" fill="#4a4a4a"/>
<rect id="Rectangle_37" data-name="Rectangle 37" width="5.336" height="5.336" rx="1" transform="translate(77.86 0)" fill="#4a4a4a"/>
<rect id="Rectangle_38" data-name="Rectangle 38" width="5.336" height="5.336" rx="1" transform="translate(84.183 0)" fill="#4a4a4a"/>
<rect id="Rectangle_39" data-name="Rectangle 39" width="5.336" height="5.336" rx="1" transform="translate(8.299 0)" fill="#4a4a4a"/>
<rect id="Rectangle_40" data-name="Rectangle 40" width="5.336" height="5.336" rx="1" transform="translate(14.623 0)" fill="#4a4a4a"/>
<rect id="Rectangle_41" data-name="Rectangle 41" width="5.336" height="5.336" rx="1" transform="translate(20.947 0)" fill="#4a4a4a"/>
<rect id="Rectangle_42" data-name="Rectangle 42" width="5.336" height="5.336" rx="1" transform="translate(27.271 0)" fill="#4a4a4a"/>
<rect id="Rectangle_43" data-name="Rectangle 43" width="5.336" height="5.336" rx="1" transform="translate(33.594 0)" fill="#4a4a4a"/>
<rect id="Rectangle_44" data-name="Rectangle 44" width="5.336" height="5.336" rx="1" transform="translate(39.918 0)" fill="#4a4a4a"/>
<rect id="Rectangle_45" data-name="Rectangle 45" width="5.336" height="5.336" rx="1" transform="translate(46.242 0)" fill="#4a4a4a"/>
<rect id="Rectangle_46" data-name="Rectangle 46" width="5.336" height="5.336" rx="1" transform="translate(52.565 0)" fill="#4a4a4a"/>
<rect id="Rectangle_47" data-name="Rectangle 47" width="5.336" height="5.336" rx="1" transform="translate(58.889 0)" fill="#4a4a4a"/>
<rect id="Rectangle_48" data-name="Rectangle 48" width="5.336" height="5.336" rx="1" transform="translate(65.213 0)" fill="#4a4a4a"/>
<rect id="Rectangle_49" data-name="Rectangle 49" width="5.336" height="5.336" rx="1" transform="translate(71.537 0)" fill="#4a4a4a"/>
<rect id="Rectangle_50" data-name="Rectangle 50" width="5.336" height="5.336" rx="1" transform="translate(77.86 0)" fill="#4a4a4a"/>
<rect id="Rectangle_51" data-name="Rectangle 51" width="5.336" height="5.336" rx="1" transform="translate(84.183 0)" fill="#4a4a4a"/>
</g>
<g id="Group_6" data-name="Group 6" transform="translate(1.531 16.584)">
<path id="Path_54" data-name="Path 54" d="M1.093,0h7.3A1.093,1.093,0,0,1,9.485,1.093v3.15A1.093,1.093,0,0,1,8.392,5.336h-7.3A1.093,1.093,0,0,1,0,4.243V1.094A1.093,1.093,0,0,1,1.093,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<g id="Group_5" data-name="Group 5" transform="translate(10.671 0)">
<rect id="Rectangle_52" data-name="Rectangle 52" width="5.336" height="5.336" rx="1" fill="#4a4a4a"/>
<rect id="Rectangle_53" data-name="Rectangle 53" width="5.336" height="5.336" rx="1" transform="translate(6.324 0)" fill="#4a4a4a"/>
<rect id="Rectangle_54" data-name="Rectangle 54" width="5.336" height="5.336" rx="1" transform="translate(12.647 0)" fill="#4a4a4a"/>
<rect id="Rectangle_55" data-name="Rectangle 55" width="5.336" height="5.336" rx="1" transform="translate(18.971 0)" fill="#4a4a4a"/>
<rect id="Rectangle_56" data-name="Rectangle 56" width="5.336" height="5.336" rx="1" transform="translate(25.295 0)" fill="#4a4a4a"/>
<rect id="Rectangle_57" data-name="Rectangle 57" width="5.336" height="5.336" rx="1" transform="translate(31.619 0)" fill="#4a4a4a"/>
<rect id="Rectangle_58" data-name="Rectangle 58" width="5.336" height="5.336" rx="1" transform="translate(37.942 0)" fill="#4a4a4a"/>
<rect id="Rectangle_59" data-name="Rectangle 59" width="5.336" height="5.336" rx="1" transform="translate(44.265 0)" fill="#4a4a4a"/>
<rect id="Rectangle_60" data-name="Rectangle 60" width="5.336" height="5.336" rx="1" transform="translate(50.589 0)" fill="#4a4a4a"/>
<rect id="Rectangle_61" data-name="Rectangle 61" width="5.336" height="5.336" rx="1" transform="translate(56.912 0)" fill="#4a4a4a"/>
<rect id="Rectangle_62" data-name="Rectangle 62" width="5.336" height="5.336" rx="1" transform="translate(63.236 0)" fill="#4a4a4a"/>
</g>
<path id="Path_55" data-name="Path 55" d="M1.094,0H8A1.093,1.093,0,0,1,9.091,1.093v3.15A1.093,1.093,0,0,1,8,5.336H1.093A1.093,1.093,0,0,1,0,4.243V1.094A1.093,1.093,0,0,1,1.093,0Z" transform="translate(80.428 0)" fill="#4a4a4a" fill-rule="evenodd"/>
</g>
<g id="Group_7" data-name="Group 7" transform="translate(1.531 29.627)">
<rect id="Rectangle_63" data-name="Rectangle 63" width="5.336" height="5.336" rx="1" transform="translate(0 0)" fill="#4a4a4a"/>
<rect id="Rectangle_64" data-name="Rectangle 64" width="5.336" height="5.336" rx="1" transform="translate(6.324 0)" fill="#4a4a4a"/>
<rect id="Rectangle_65" data-name="Rectangle 65" width="5.336" height="5.336" rx="1" transform="translate(12.647 0)" fill="#4a4a4a"/>
<rect id="Rectangle_66" data-name="Rectangle 66" width="5.336" height="5.336" rx="1" transform="translate(18.971 0)" fill="#4a4a4a"/>
<path id="Path_56" data-name="Path 56" d="M1.093,0H31.515a1.093,1.093,0,0,1,1.093,1.093V4.244a1.093,1.093,0,0,1-1.093,1.093H1.093A1.093,1.093,0,0,1,0,4.244V1.093A1.093,1.093,0,0,1,1.093,0ZM34.687,0h3.942a1.093,1.093,0,0,1,1.093,1.093V4.244a1.093,1.093,0,0,1-1.093,1.093H34.687a1.093,1.093,0,0,1-1.093-1.093V1.093A1.093,1.093,0,0,1,34.687,0Z" transform="translate(25.294 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_67" data-name="Rectangle 67" width="5.336" height="5.336" rx="1" transform="translate(66.003 0)" fill="#4a4a4a"/>
<rect id="Rectangle_68" data-name="Rectangle 68" width="5.336" height="5.336" rx="1" transform="translate(72.327 0)" fill="#4a4a4a"/>
<rect id="Rectangle_69" data-name="Rectangle 69" width="5.336" height="5.336" rx="1" transform="translate(84.183 0)" fill="#4a4a4a"/>
<path id="Path_57" data-name="Path 57" d="M5.336,0V1.18A1.093,1.093,0,0,1,4.243,2.273H1.093A1.093,1.093,0,0,1,0,1.18V0Z" transform="translate(83.59 2.273) rotate(180)" fill="#4a4a4a"/>
<path id="Path_58" data-name="Path 58" d="M5.336,0V1.18A1.093,1.093,0,0,1,4.243,2.273H1.093A1.093,1.093,0,0,1,0,1.18V0Z" transform="translate(78.255 3.063)" fill="#4a4a4a"/>
</g>
<rect id="Rectangle_70" data-name="Rectangle 70" width="88.927" height="2.371" rx="1.085" transform="translate(1.925 1.17)" fill="#4a4a4a"/>
<rect id="Rectangle_71" data-name="Rectangle 71" width="4.986" height="1.581" rx="0.723" transform="translate(4.1 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_72" data-name="Rectangle 72" width="4.986" height="1.581" rx="0.723" transform="translate(10.923 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_73" data-name="Rectangle 73" width="4.986" height="1.581" rx="0.723" transform="translate(16.173 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_74" data-name="Rectangle 74" width="4.986" height="1.581" rx="0.723" transform="translate(21.421 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_75" data-name="Rectangle 75" width="4.986" height="1.581" rx="0.723" transform="translate(26.671 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_76" data-name="Rectangle 76" width="4.986" height="1.581" rx="0.723" transform="translate(33.232 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_77" data-name="Rectangle 77" width="4.986" height="1.581" rx="0.723" transform="translate(38.48 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_78" data-name="Rectangle 78" width="4.986" height="1.581" rx="0.723" transform="translate(43.73 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_79" data-name="Rectangle 79" width="4.986" height="1.581" rx="0.723" transform="translate(48.978 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_80" data-name="Rectangle 80" width="4.986" height="1.581" rx="0.723" transform="translate(55.54 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_81" data-name="Rectangle 81" width="4.986" height="1.581" rx="0.723" transform="translate(60.788 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_82" data-name="Rectangle 82" width="4.986" height="1.581" rx="0.723" transform="translate(66.038 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_83" data-name="Rectangle 83" width="4.986" height="1.581" rx="0.723" transform="translate(72.599 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_84" data-name="Rectangle 84" width="4.986" height="1.581" rx="0.723" transform="translate(77.847 1.566)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_85" data-name="Rectangle 85" width="4.986" height="1.581" rx="0.723" transform="translate(83.097 1.566)" fill="#d8d8d8" opacity="0.136"/>
</g>
<path id="Path_59" data-name="Path 59" d="M146.71,159.855a5.439,5.439,0,0,0-.7.07c-.042-.164-.081-.329-.127-.493a5.457,5.457,0,1,0-5.4-9.372q-.181-.185-.366-.367a5.454,5.454,0,1,0-9.384-5.4c-.162-.046-.325-.084-.486-.126a5.467,5.467,0,1,0-10.788,0c-.162.042-.325.08-.486.126a5.457,5.457,0,1,0-9.384,5.4,21.843,21.843,0,1,0,36.421,21.02,5.452,5.452,0,1,0,.7-10.858" transform="translate(6.275 -6.025)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_60" data-name="Path 60" d="M83,124.855h43.71V103H83Z" transform="translate(4.42 -9.271)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_61" data-name="Path 61" d="M134.855,116.765a2.732,2.732,0,1,0,0-5.464,2.811,2.811,0,0,0-.349.035c-.022-.082-.04-.164-.063-.246a2.733,2.733,0,0,0-1.052-5.253,2.7,2.7,0,0,0-1.648.566q-.09-.093-.184-.184a2.7,2.7,0,0,0,.553-1.633,2.732,2.732,0,0,0-5.245-1.07,10.928,10.928,0,1,0,0,21.031,2.732,2.732,0,0,0,5.245-1.07,2.7,2.7,0,0,0-.553-1.633q.093-.09.184-.184a2.7,2.7,0,0,0,1.648.566,2.732,2.732,0,0,0,1.052-5.253c.023-.081.042-.164.063-.246a2.811,2.811,0,0,0,.349.035" transform="translate(7.202 -9.377)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_62" data-name="Path 62" d="M143.232,42.33a2.967,2.967,0,0,1-.535-.055,2.754,2.754,0,0,1-.514-.153,2.838,2.838,0,0,1-.471-.251,4.139,4.139,0,0,1-.415-.339,3.2,3.2,0,0,1-.338-.415A2.7,2.7,0,0,1,140.5,39.6a2.968,2.968,0,0,1,.055-.535,3.152,3.152,0,0,1,.152-.514,2.874,2.874,0,0,1,.252-.47,2.633,2.633,0,0,1,.753-.754,2.837,2.837,0,0,1,.471-.251,2.753,2.753,0,0,1,.514-.153,2.527,2.527,0,0,1,1.071,0,2.654,2.654,0,0,1,.983.4,4.139,4.139,0,0,1,.415.339,4.019,4.019,0,0,1,.339.415,2.786,2.786,0,0,1,.251.47,2.864,2.864,0,0,1,.208,1.049,2.77,2.77,0,0,1-.8,1.934,4.139,4.139,0,0,1-.415.339,2.722,2.722,0,0,1-1.519.459m21.855-1.366a2.789,2.789,0,0,1-1.935-.8,4.162,4.162,0,0,1-.338-.415,2.7,2.7,0,0,1-.459-1.519,2.789,2.789,0,0,1,.8-1.934,4.139,4.139,0,0,1,.415-.339,2.838,2.838,0,0,1,.471-.251,2.752,2.752,0,0,1,.514-.153,2.527,2.527,0,0,1,1.071,0,2.654,2.654,0,0,1,.983.4,4.139,4.139,0,0,1,.415.339,2.79,2.79,0,0,1,.8,1.934,3.069,3.069,0,0,1-.055.535,2.779,2.779,0,0,1-.153.514,3.885,3.885,0,0,1-.251.47,4.02,4.02,0,0,1-.339.415,4.138,4.138,0,0,1-.415.339,2.722,2.722,0,0,1-1.519.459" transform="translate(9.753 -15.532)" fill-rule="evenodd"/>
</g>
</g>
</svg>

@@ -0,0 +1,170 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1041.277" height="554.141" viewBox="0 0 1041.277 554.141">
<title>Powered by React</title>
<g id="Group_24" data-name="Group 24" transform="translate(-440 -263)">
<g id="Group_23" data-name="Group 23" transform="translate(439.989 262.965)">
<path id="Path_299" data-name="Path 299" d="M1040.82,611.12q-1.74,3.75-3.47,7.4-2.7,5.67-5.33,11.12c-.78,1.61-1.56,3.19-2.32,4.77-8.6,17.57-16.63,33.11-23.45,45.89A73.21,73.21,0,0,1,942.44,719l-151.65,1.65h-1.6l-13,.14-11.12.12-34.1.37h-1.38l-17.36.19h-.53l-107,1.16-95.51,1-11.11.12-69,.75H429l-44.75.48h-.48l-141.5,1.53-42.33.46a87.991,87.991,0,0,1-10.79-.54h0c-1.22-.14-2.44-.3-3.65-.49a87.38,87.38,0,0,1-51.29-27.54C116,678.37,102.75,655,93.85,629.64q-1.93-5.49-3.6-11.12C59.44,514.37,97,380,164.6,290.08q4.25-5.64,8.64-11l.07-.08c20.79-25.52,44.1-46.84,68.93-62,44-26.91,92.75-34.49,140.7-11.9,40.57,19.12,78.45,28.11,115.17,30.55,3.71.24,7.42.42,11.11.53,84.23,2.65,163.17-27.7,255.87-47.29,3.69-.78,7.39-1.55,11.12-2.28,66.13-13.16,139.49-20.1,226.73-5.51a189.089,189.089,0,0,1,26.76,6.4q5.77,1.86,11.12,4c41.64,16.94,64.35,48.24,74,87.46q1.37,5.46,2.37,11.11C1134.3,384.41,1084.19,518.23,1040.82,611.12Z" transform="translate(-79.34 -172.91)" fill="#f2f2f2"/>
<path id="Path_300" data-name="Path 300" d="M576.36,618.52a95.21,95.21,0,0,1-1.87,11.12h93.7V618.52Zm-78.25,62.81,11.11-.09V653.77c-3.81-.17-7.52-.34-11.11-.52ZM265.19,618.52v11.12h198.5V618.52ZM1114.87,279h-74V191.51q-5.35-2.17-11.12-4V279H776.21V186.58c-3.73.73-7.43,1.5-11.12,2.28V279H509.22V236.15c-3.69-.11-7.4-.29-11.11-.53V279H242.24V217c-24.83,15.16-48.14,36.48-68.93,62h-.07v.08q-4.4,5.4-8.64,11h8.64V618.52h-83q1.66,5.63,3.6,11.12h79.39v93.62a87,87,0,0,0,12.2,2.79c1.21.19,2.43.35,3.65.49h0a87.991,87.991,0,0,0,10.79.54l42.33-.46v-97H498.11v94.21l11.11-.12V629.64H765.09V721l11.12-.12V629.64H1029.7v4.77c.76-1.58,1.54-3.16,2.32-4.77q2.63-5.45,5.33-11.12,1.73-3.64,3.47-7.4v-321h76.42Q1116.23,284.43,1114.87,279ZM242.24,618.52V290.08H498.11V618.52Zm267,0V290.08H765.09V618.52Zm520.48,0H776.21V290.08H1029.7Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_301" data-name="Path 301" d="M863.09,533.65v13l-151.92,1.4-1.62.03-57.74.53-1.38.02-17.55.15h-.52l-106.98.99L349.77,551.4h-.15l-44.65.42-.48.01-198.4,1.82v-15l46.65-28,93.6-.78,2-.01.66-.01,2-.03,44.94-.37,2.01-.01.64-.01,2-.01L315,509.3l.38-.01,35.55-.3h.29l277.4-2.34,6.79-.05h.68l5.18-.05,37.65-.31,2-.03,1.85-.02h.96l11.71-.09,2.32-.03,3.11-.02,9.75-.09,15.47-.13,2-.02,3.48-.02h.65l74.71-.64Z" fill="#65617d"/>
<path id="Path_302" data-name="Path 302" d="M863.09,533.65v13l-151.92,1.4-1.62.03-57.74.53-1.38.02-17.55.15h-.52l-106.98.99L349.77,551.4h-.15l-44.65.42-.48.01-198.4,1.82v-15l46.65-28,93.6-.78,2-.01.66-.01,2-.03,44.94-.37,2.01-.01.64-.01,2-.01L315,509.3l.38-.01,35.55-.3h.29l277.4-2.34,6.79-.05h.68l5.18-.05,37.65-.31,2-.03,1.85-.02h.96l11.71-.09,2.32-.03,3.11-.02,9.75-.09,15.47-.13,2-.02,3.48-.02h.65l74.71-.64Z" opacity="0.2"/>
<path id="Path_303" data-name="Path 303" d="M375.44,656.57v24.49a6.13,6.13,0,0,1-3.5,5.54,6,6,0,0,1-2.5.6l-34.9.74a6,6,0,0,1-2.7-.57,6.12,6.12,0,0,1-3.57-5.57V656.57Z" transform="translate(-79.34 -172.91)" fill="#3f3d56"/>
<path id="Path_304" data-name="Path 304" d="M375.44,656.57v24.49a6.13,6.13,0,0,1-3.5,5.54,6,6,0,0,1-2.5.6l-34.9.74a6,6,0,0,1-2.7-.57,6.12,6.12,0,0,1-3.57-5.57V656.57Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_305" data-name="Path 305" d="M377.44,656.57v24.49a6.13,6.13,0,0,1-3.5,5.54,6,6,0,0,1-2.5.6l-34.9.74a6,6,0,0,1-2.7-.57,6.12,6.12,0,0,1-3.57-5.57V656.57Z" transform="translate(-79.34 -172.91)" fill="#3f3d56"/>
<rect id="Rectangle_137" data-name="Rectangle 137" width="47.17" height="31.5" transform="translate(680.92 483.65)" fill="#3f3d56"/>
<rect id="Rectangle_138" data-name="Rectangle 138" width="47.17" height="31.5" transform="translate(680.92 483.65)" opacity="0.1"/>
<rect id="Rectangle_139" data-name="Rectangle 139" width="47.17" height="31.5" transform="translate(678.92 483.65)" fill="#3f3d56"/>
<path id="Path_306" data-name="Path 306" d="M298.09,483.65v4.97l-47.17,1.26v-6.23Z" opacity="0.1"/>
<path id="Path_307" data-name="Path 307" d="M460.69,485.27v168.2a4,4,0,0,1-3.85,3.95l-191.65,5.1h-.05a4,4,0,0,1-3.95-3.95V485.27a4,4,0,0,1,3.95-3.95h191.6a4,4,0,0,1,3.95,3.95Z" transform="translate(-79.34 -172.91)" fill="#65617d"/>
<path id="Path_308" data-name="Path 308" d="M265.19,481.32v181.2h-.05a4,4,0,0,1-3.95-3.95V485.27a4,4,0,0,1,3.95-3.95Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_309" data-name="Path 309" d="M194.59,319.15h177.5V467.4l-177.5,4Z" fill="#39374d"/>
<path id="Path_310" data-name="Path 310" d="M726.09,483.65v6.41l-47.17-1.26v-5.15Z" opacity="0.1"/>
<path id="Path_311" data-name="Path 311" d="M867.69,485.27v173.3a4,4,0,0,1-4,3.95h0L672,657.42a4,4,0,0,1-3.85-3.95V485.27a4,4,0,0,1,3.95-3.95H863.7a4,4,0,0,1,3.99,3.95Z" transform="translate(-79.34 -172.91)" fill="#65617d"/>
<path id="Path_312" data-name="Path 312" d="M867.69,485.27v173.3a4,4,0,0,1-4,3.95h0V481.32h0a4,4,0,0,1,4,3.95Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_313" data-name="Path 313" d="M775.59,319.15H598.09V467.4l177.5,4Z" fill="#39374d"/>
<path id="Path_314" data-name="Path 314" d="M663.19,485.27v168.2a4,4,0,0,1-3.85,3.95l-191.65,5.1h0a4,4,0,0,1-4-3.95V485.27a4,4,0,0,1,3.95-3.95h191.6A4,4,0,0,1,663.19,485.27Z" transform="translate(-79.34 -172.91)" fill="#65617d"/>
<path id="Path_315" data-name="Path 315" d="M397.09,319.15h177.5V467.4l-177.5,4Z" fill="#4267b2"/>
<path id="Path_316" data-name="Path 316" d="M863.09,533.65v13l-151.92,1.4-1.62.03-57.74.53-1.38.02-17.55.15h-.52l-106.98.99L349.77,551.4h-.15l-44.65.42-.48.01-198.4,1.82v-15l202.51-1.33h.48l40.99-.28h.19l283.08-1.87h.29l.17-.01h.47l4.79-.03h1.46l74.49-.5,4.4-.02.98-.01Z" opacity="0.1"/>
<circle id="Ellipse_111" data-name="Ellipse 111" cx="51.33" cy="51.33" r="51.33" transform="translate(435.93 246.82)" fill="#fbbebe"/>
<path id="Path_317" data-name="Path 317" d="M617.94,550.07s-99.5,12-90,0c3.44-4.34,4.39-17.2,4.2-31.85-.06-4.45-.22-9.06-.45-13.65-1.1-22-3.75-43.5-3.75-43.5s87-41,77-8.5c-4,13.13-2.69,31.57.35,48.88.89,5.05,1.92,10,3,14.7a344.66,344.66,0,0,0,9.65,33.92Z" transform="translate(-79.34 -172.91)" fill="#fbbebe"/>
<path id="Path_318" data-name="Path 318" d="M585.47,546c11.51-2.13,23.7-6,34.53-1.54,2.85,1.17,5.47,2.88,8.39,3.86s6.12,1.22,9.16,1.91c10.68,2.42,19.34,10.55,24.9,20s8.44,20.14,11.26,30.72l6.9,25.83c6,22.45,12,45.09,13.39,68.3a2437.506,2437.506,0,0,1-250.84,1.43c5.44-10.34,11-21.31,10.54-33s-7.19-23.22-4.76-34.74c1.55-7.34,6.57-13.39,9.64-20.22,8.75-19.52,1.94-45.79,17.32-60.65,6.92-6.68,17-9.21,26.63-8.89,12.28.41,24.85,4.24,37,6.11C555.09,547.48,569.79,548.88,585.47,546Z" transform="translate(-79.34 -172.91)" fill="#ff6584"/>
<path id="Path_319" data-name="Path 319" d="M716.37,657.17l-.1,1.43v.1l-.17,2.3-1.33,18.51-1.61,22.3-.46,6.28-1,13.44v.17l-107,1-175.59,1.9v.84h-.14v-1.12l.45-14.36.86-28.06.74-23.79.07-2.37a10.53,10.53,0,0,1,11.42-10.17c4.72.4,10.85.89,18.18,1.41l3,.22c42.33,2.94,120.56,6.74,199.5,2,1.66-.09,3.33-.19,5-.31,12.24-.77,24.47-1.76,36.58-3a10.53,10.53,0,0,1,11.6,11.23Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_320" data-name="Path 320" d="M429.08,725.44v-.84l175.62-1.91,107-1h.3v-.17l1-13.44.43-6,1.64-22.61,1.29-17.9v-.44a10.617,10.617,0,0,0-.11-2.47.3.3,0,0,0,0-.1,10.391,10.391,0,0,0-2-4.64,10.54,10.54,0,0,0-9.42-4c-12.11,1.24-24.34,2.23-36.58,3-1.67.12-3.34.22-5,.31-78.94,4.69-157.17.89-199.5-2l-3-.22c-7.33-.52-13.46-1-18.18-1.41a10.54,10.54,0,0,0-11.24,8.53,11,11,0,0,0-.18,1.64l-.68,22.16L429.54,710l-.44,14.36v1.12Z" transform="translate(-79.34 -172.91)" fill="#3f3d56"/>
<path id="Path_321" data-name="Path 321" d="M716.67,664.18l-1.23,15.33-1.83,22.85-.46,5.72-1,12.81-.06.64v.17h0l-.15,1.48.11-1.48h-.29l-107,1-175.65,1.9v-.28l.49-14.36,1-28.06.64-18.65A6.36,6.36,0,0,1,434.3,658a6.25,6.25,0,0,1,3.78-.9c2.1.17,4.68.37,7.69.59,4.89.36,10.92.78,17.94,1.22,13,.82,29.31,1.7,48,2.42,52,2,122.2,2.67,188.88-3.17,3-.26,6.1-.55,9.13-.84a6.26,6.26,0,0,1,3.48.66,5.159,5.159,0,0,1,.86.54,6.14,6.14,0,0,1,2,2.46,3.564,3.564,0,0,1,.25.61A6.279,6.279,0,0,1,716.67,664.18Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_322" data-name="Path 322" d="M377.44,677.87v3.19a6.13,6.13,0,0,1-3.5,5.54l-40.1.77a6.12,6.12,0,0,1-3.57-5.57v-3Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_323" data-name="Path 323" d="M298.59,515.57l-52.25,1V507.9l52.25-1Z" fill="#3f3d56"/>
<path id="Path_324" data-name="Path 324" d="M298.59,515.57l-52.25,1V507.9l52.25-1Z" opacity="0.1"/>
<path id="Path_325" data-name="Path 325" d="M300.59,515.57l-52.25,1V507.9l52.25-1Z" fill="#3f3d56"/>
<path id="Path_326" data-name="Path 326" d="M758.56,679.87v3.19a6.13,6.13,0,0,0,3.5,5.54l40.1.77a6.12,6.12,0,0,0,3.57-5.57v-3Z" transform="translate(-79.34 -172.91)" opacity="0.1"/>
<path id="Path_327" data-name="Path 327" d="M678.72,517.57l52.25,1V509.9l-52.25-1Z" opacity="0.1"/>
<path id="Path_328" data-name="Path 328" d="M676.72,517.57l52.25,1V509.9l-52.25-1Z" fill="#3f3d56"/>
<path id="Path_329" data-name="Path 329" d="M534.13,486.79c.08,7-3.16,13.6-5.91,20.07a163.491,163.491,0,0,0-12.66,74.71c.73,11,2.58,22,.73,32.9s-8.43,21.77-19,24.9c17.53,10.45,41.26,9.35,57.76-2.66,8.79-6.4,15.34-15.33,21.75-24.11a97.86,97.86,0,0,1-13.31,44.75A103.43,103.43,0,0,0,637,616.53c4.31-5.81,8.06-12.19,9.72-19.23,3.09-13-1.22-26.51-4.51-39.5a266.055,266.055,0,0,1-6.17-33c-.43-3.56-.78-7.22.1-10.7,1-4.07,3.67-7.51,5.64-11.22,5.6-10.54,5.73-23.3,2.86-34.88s-8.49-22.26-14.06-32.81c-4.46-8.46-9.3-17.31-17.46-22.28-5.1-3.1-11-4.39-16.88-5.64l-25.37-5.43c-5.55-1.19-11.26-2.38-16.87-1.51-9.47,1.48-16.14,8.32-22,15.34-4.59,5.46-15.81,15.71-16.6,22.86-.72,6.59,5.1,17.63,6.09,24.58,1.3,9,2.22,6,7.3,11.52C532,478.05,534.07,482,534.13,486.79Z" transform="translate(-79.34 -172.91)" fill="#3f3d56"/>
</g>
<g id="docusaurus_keytar" transform="translate(670.271 615.768)">
<path id="Path_40" data-name="Path 40" d="M99,52h43.635V69.662H99Z" transform="translate(-49.132 -33.936)" fill="#fff" fill-rule="evenodd"/>
<path id="Path_41" data-name="Path 41" d="M13.389,158.195A10.377,10.377,0,0,1,4.4,153a10.377,10.377,0,0,0,8.988,15.584H23.779V158.195Z" transform="translate(-3 -82.47)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_42" data-name="Path 42" d="M66.967,38.083l36.373-2.273V30.615A10.389,10.389,0,0,0,92.95,20.226H46.2l-1.3-2.249a1.5,1.5,0,0,0-2.6,0L41,20.226l-1.3-2.249a1.5,1.5,0,0,0-2.6,0l-1.3,2.249-1.3-2.249a1.5,1.5,0,0,0-2.6,0l-1.3,2.249-.034,0-2.152-2.151a1.5,1.5,0,0,0-2.508.672L25.21,21.4l-2.7-.723a1.5,1.5,0,0,0-1.836,1.837l.722,2.7-2.65.71a1.5,1.5,0,0,0-.673,2.509l2.152,2.152c0,.011,0,.022,0,.033l-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6L20.226,41l-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3-2.249,1.3a1.5,1.5,0,0,0,0,2.6l2.249,1.3A10.389,10.389,0,0,0,30.615,103.34H92.95A10.389,10.389,0,0,0,103.34,92.95V51.393L66.967,49.12a5.53,5.53,0,0,1,0-11.038" transform="translate(-9.836 -17.226)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_43" data-name="Path 43" d="M143,163.779h15.584V143H143Z" transform="translate(-70.275 -77.665)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_44" data-name="Path 44" d="M173.779,148.389a2.582,2.582,0,0,0-.332.033c-.02-.078-.038-.156-.06-.234a2.594,2.594,0,1,0-2.567-4.455q-.086-.088-.174-.175a2.593,2.593,0,1,0-4.461-2.569c-.077-.022-.154-.04-.231-.06a2.6,2.6,0,1,0-5.128,0c-.077.02-.154.038-.231.06a2.594,2.594,0,1,0-4.461,2.569,10.384,10.384,0,1,0,17.314,9.992,2.592,2.592,0,1,0,.332-5.161" transform="translate(-75.08 -75.262)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_45" data-name="Path 45" d="M153,113.389h15.584V103H153Z" transform="translate(-75.08 -58.444)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_46" data-name="Path 46" d="M183.389,108.944a1.3,1.3,0,1,0,0-2.6,1.336,1.336,0,0,0-.166.017c-.01-.039-.019-.078-.03-.117a1.3,1.3,0,0,0-.5-2.5,1.285,1.285,0,0,0-.783.269q-.043-.044-.087-.087a1.285,1.285,0,0,0,.263-.776,1.3,1.3,0,0,0-2.493-.509,5.195,5.195,0,1,0,0,10,1.3,1.3,0,0,0,2.493-.509,1.285,1.285,0,0,0-.263-.776q.044-.043.087-.087a1.285,1.285,0,0,0,.783.269,1.3,1.3,0,0,0,.5-2.5c.011-.038.02-.078.03-.117a1.337,1.337,0,0,0,.166.017" transform="translate(-84.691 -57.894)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_47" data-name="Path 47" d="M52.188,48.292a1.3,1.3,0,0,1-1.3-1.3,3.9,3.9,0,0,0-7.792,0,1.3,1.3,0,1,1-2.6,0,6.493,6.493,0,0,1,12.987,0,1.3,1.3,0,0,1-1.3,1.3" transform="translate(-21.02 -28.41)" fill-rule="evenodd"/>
<path id="Path_48" data-name="Path 48" d="M103,139.752h31.168a10.389,10.389,0,0,0,10.389-10.389V93H113.389A10.389,10.389,0,0,0,103,103.389Z" transform="translate(-51.054 -53.638)" fill="#ffff50" fill-rule="evenodd"/>
<path id="Path_49" data-name="Path 49" d="M141.1,94.017H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m0,10.389H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m0,10.389H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m0-25.877H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m0,10.293H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m0,10.389H115.106a.519.519,0,1,1,0-1.039H141.1a.519.519,0,0,1,0,1.039m7.782-47.993c-.006,0-.011,0-.018,0-1.605.055-2.365,1.66-3.035,3.077-.7,1.48-1.24,2.443-2.126,2.414-.981-.035-1.542-1.144-2.137-2.317-.683-1.347-1.462-2.876-3.1-2.819-1.582.054-2.344,1.451-3.017,2.684-.715,1.313-1.2,2.112-2.141,2.075-1-.036-1.533-.938-2.149-1.981-.686-1.162-1.479-2.467-3.084-2.423-1.555.053-2.319,1.239-2.994,2.286-.713,1.106-1.213,1.781-2.164,1.741-1.025-.036-1.554-.784-2.167-1.65-.688-.973-1.463-2.074-3.062-2.021a3.815,3.815,0,0,0-2.959,1.879c-.64.812-1.14,1.456-2.2,1.415a.52.52,0,0,0-.037,1.039,3.588,3.588,0,0,0,3.05-1.811c.611-.777,1.139-1.448,2.178-1.483,1-.043,1.47.579,2.179,1.582.674.953,1.438,2.033,2.977,2.089,1.612.054,2.387-1.151,3.074-2.217.614-.953,1.144-1.775,2.156-1.81.931-.035,1.438.7,2.153,1.912.674,1.141,1.437,2.434,3.006,2.491,1.623.056,2.407-1.361,3.09-2.616.592-1.085,1.15-2.109,2.14-2.143.931-.022,1.417.829,2.135,2.249.671,1.326,1.432,2.828,3.026,2.886l.088,0c1.592,0,2.347-1.6,3.015-3.01.592-1.252,1.152-2.431,2.113-2.479Z" transform="translate(-55.378 -38.552)" fill-rule="evenodd"/>
<path id="Path_50" data-name="Path 50" d="M83,163.779h20.779V143H83Z" transform="translate(-41.443 -77.665)" fill="#3ecc5f" fill-rule="evenodd"/>
<g id="Group_8" data-name="Group 8" transform="matrix(0.966, -0.259, 0.259, 0.966, 51.971, 43.3)">
<rect id="Rectangle_3" data-name="Rectangle 3" width="43.906" height="17.333" rx="2" transform="translate(0 0)" fill="#d8d8d8"/>
<g id="Group_2" data-name="Group 2" transform="translate(0.728 10.948)">
<rect id="Rectangle_4" data-name="Rectangle 4" width="2.537" height="2.537" rx="1" transform="translate(7.985 0)" fill="#4a4a4a"/>
<rect id="Rectangle_5" data-name="Rectangle 5" width="2.537" height="2.537" rx="1" transform="translate(10.991 0)" fill="#4a4a4a"/>
<rect id="Rectangle_6" data-name="Rectangle 6" width="2.537" height="2.537" rx="1" transform="translate(13.997 0)" fill="#4a4a4a"/>
<rect id="Rectangle_7" data-name="Rectangle 7" width="2.537" height="2.537" rx="1" transform="translate(17.003 0)" fill="#4a4a4a"/>
<rect id="Rectangle_8" data-name="Rectangle 8" width="2.537" height="2.537" rx="1" transform="translate(20.009 0)" fill="#4a4a4a"/>
<rect id="Rectangle_9" data-name="Rectangle 9" width="2.537" height="2.537" rx="1" transform="translate(23.015 0)" fill="#4a4a4a"/>
<rect id="Rectangle_10" data-name="Rectangle 10" width="2.537" height="2.537" rx="1" transform="translate(26.021 0)" fill="#4a4a4a"/>
<rect id="Rectangle_11" data-name="Rectangle 11" width="2.537" height="2.537" rx="1" transform="translate(29.028 0)" fill="#4a4a4a"/>
<rect id="Rectangle_12" data-name="Rectangle 12" width="2.537" height="2.537" rx="1" transform="translate(32.034 0)" fill="#4a4a4a"/>
<path id="Path_51" data-name="Path 51" d="M.519,0H6.9A.519.519,0,0,1,7.421.52v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.017V.519A.519.519,0,0,1,.519,0ZM35.653,0h6.383a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H35.652a.519.519,0,0,1-.519-.519V.519A.519.519,0,0,1,35.652,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
</g>
<g id="Group_3" data-name="Group 3" transform="translate(0.728 4.878)">
<path id="Path_52" data-name="Path 52" d="M.519,0H2.956a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.017V.519A.519.519,0,0,1,.519,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_13" data-name="Rectangle 13" width="2.537" height="2.537" rx="1" transform="translate(3.945 0)" fill="#4a4a4a"/>
<rect id="Rectangle_14" data-name="Rectangle 14" width="2.537" height="2.537" rx="1" transform="translate(6.951 0)" fill="#4a4a4a"/>
<rect id="Rectangle_15" data-name="Rectangle 15" width="2.537" height="2.537" rx="1" transform="translate(9.958 0)" fill="#4a4a4a"/>
<rect id="Rectangle_16" data-name="Rectangle 16" width="2.537" height="2.537" rx="1" transform="translate(12.964 0)" fill="#4a4a4a"/>
<rect id="Rectangle_17" data-name="Rectangle 17" width="2.537" height="2.537" rx="1" transform="translate(15.97 0)" fill="#4a4a4a"/>
<rect id="Rectangle_18" data-name="Rectangle 18" width="2.537" height="2.537" rx="1" transform="translate(18.976 0)" fill="#4a4a4a"/>
<rect id="Rectangle_19" data-name="Rectangle 19" width="2.537" height="2.537" rx="1" transform="translate(21.982 0)" fill="#4a4a4a"/>
<rect id="Rectangle_20" data-name="Rectangle 20" width="2.537" height="2.537" rx="1" transform="translate(24.988 0)" fill="#4a4a4a"/>
<rect id="Rectangle_21" data-name="Rectangle 21" width="2.537" height="2.537" rx="1" transform="translate(27.994 0)" fill="#4a4a4a"/>
<rect id="Rectangle_22" data-name="Rectangle 22" width="2.537" height="2.537" rx="1" transform="translate(31 0)" fill="#4a4a4a"/>
<rect id="Rectangle_23" data-name="Rectangle 23" width="2.537" height="2.537" rx="1" transform="translate(34.006 0)" fill="#4a4a4a"/>
<rect id="Rectangle_24" data-name="Rectangle 24" width="2.537" height="2.537" rx="1" transform="translate(37.012 0)" fill="#4a4a4a"/>
<rect id="Rectangle_25" data-name="Rectangle 25" width="2.537" height="2.537" rx="1" transform="translate(40.018 0)" fill="#4a4a4a"/>
</g>
<g id="Group_4" data-name="Group 4" transform="translate(43.283 4.538) rotate(180)">
<path id="Path_53" data-name="Path 53" d="M.519,0H2.956a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.017V.519A.519.519,0,0,1,.519,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_26" data-name="Rectangle 26" width="2.537" height="2.537" rx="1" transform="translate(3.945 0)" fill="#4a4a4a"/>
<rect id="Rectangle_27" data-name="Rectangle 27" width="2.537" height="2.537" rx="1" transform="translate(6.951 0)" fill="#4a4a4a"/>
<rect id="Rectangle_28" data-name="Rectangle 28" width="2.537" height="2.537" rx="1" transform="translate(9.958 0)" fill="#4a4a4a"/>
<rect id="Rectangle_29" data-name="Rectangle 29" width="2.537" height="2.537" rx="1" transform="translate(12.964 0)" fill="#4a4a4a"/>
<rect id="Rectangle_30" data-name="Rectangle 30" width="2.537" height="2.537" rx="1" transform="translate(15.97 0)" fill="#4a4a4a"/>
<rect id="Rectangle_31" data-name="Rectangle 31" width="2.537" height="2.537" rx="1" transform="translate(18.976 0)" fill="#4a4a4a"/>
<rect id="Rectangle_32" data-name="Rectangle 32" width="2.537" height="2.537" rx="1" transform="translate(21.982 0)" fill="#4a4a4a"/>
<rect id="Rectangle_33" data-name="Rectangle 33" width="2.537" height="2.537" rx="1" transform="translate(24.988 0)" fill="#4a4a4a"/>
<rect id="Rectangle_34" data-name="Rectangle 34" width="2.537" height="2.537" rx="1" transform="translate(27.994 0)" fill="#4a4a4a"/>
<rect id="Rectangle_35" data-name="Rectangle 35" width="2.537" height="2.537" rx="1" transform="translate(31.001 0)" fill="#4a4a4a"/>
<rect id="Rectangle_36" data-name="Rectangle 36" width="2.537" height="2.537" rx="1" transform="translate(34.007 0)" fill="#4a4a4a"/>
<rect id="Rectangle_37" data-name="Rectangle 37" width="2.537" height="2.537" rx="1" transform="translate(37.013 0)" fill="#4a4a4a"/>
<rect id="Rectangle_38" data-name="Rectangle 38" width="2.537" height="2.537" rx="1" transform="translate(40.018 0)" fill="#4a4a4a"/>
<rect id="Rectangle_39" data-name="Rectangle 39" width="2.537" height="2.537" rx="1" transform="translate(3.945 0)" fill="#4a4a4a"/>
<rect id="Rectangle_40" data-name="Rectangle 40" width="2.537" height="2.537" rx="1" transform="translate(6.951 0)" fill="#4a4a4a"/>
<rect id="Rectangle_41" data-name="Rectangle 41" width="2.537" height="2.537" rx="1" transform="translate(9.958 0)" fill="#4a4a4a"/>
<rect id="Rectangle_42" data-name="Rectangle 42" width="2.537" height="2.537" rx="1" transform="translate(12.964 0)" fill="#4a4a4a"/>
<rect id="Rectangle_43" data-name="Rectangle 43" width="2.537" height="2.537" rx="1" transform="translate(15.97 0)" fill="#4a4a4a"/>
<rect id="Rectangle_44" data-name="Rectangle 44" width="2.537" height="2.537" rx="1" transform="translate(18.976 0)" fill="#4a4a4a"/>
<rect id="Rectangle_45" data-name="Rectangle 45" width="2.537" height="2.537" rx="1" transform="translate(21.982 0)" fill="#4a4a4a"/>
<rect id="Rectangle_46" data-name="Rectangle 46" width="2.537" height="2.537" rx="1" transform="translate(24.988 0)" fill="#4a4a4a"/>
<rect id="Rectangle_47" data-name="Rectangle 47" width="2.537" height="2.537" rx="1" transform="translate(27.994 0)" fill="#4a4a4a"/>
<rect id="Rectangle_48" data-name="Rectangle 48" width="2.537" height="2.537" rx="1" transform="translate(31.001 0)" fill="#4a4a4a"/>
<rect id="Rectangle_49" data-name="Rectangle 49" width="2.537" height="2.537" rx="1" transform="translate(34.007 0)" fill="#4a4a4a"/>
<rect id="Rectangle_50" data-name="Rectangle 50" width="2.537" height="2.537" rx="1" transform="translate(37.013 0)" fill="#4a4a4a"/>
<rect id="Rectangle_51" data-name="Rectangle 51" width="2.537" height="2.537" rx="1" transform="translate(40.018 0)" fill="#4a4a4a"/>
</g>
<g id="Group_6" data-name="Group 6" transform="translate(0.728 7.883)">
<path id="Path_54" data-name="Path 54" d="M.519,0h3.47a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.017V.52A.519.519,0,0,1,.519,0Z" transform="translate(0 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<g id="Group_5" data-name="Group 5" transform="translate(5.073 0)">
<rect id="Rectangle_52" data-name="Rectangle 52" width="2.537" height="2.537" rx="1" transform="translate(0 0)" fill="#4a4a4a"/>
<rect id="Rectangle_53" data-name="Rectangle 53" width="2.537" height="2.537" rx="1" transform="translate(3.006 0)" fill="#4a4a4a"/>
<rect id="Rectangle_54" data-name="Rectangle 54" width="2.537" height="2.537" rx="1" transform="translate(6.012 0)" fill="#4a4a4a"/>
<rect id="Rectangle_55" data-name="Rectangle 55" width="2.537" height="2.537" rx="1" transform="translate(9.018 0)" fill="#4a4a4a"/>
<rect id="Rectangle_56" data-name="Rectangle 56" width="2.537" height="2.537" rx="1" transform="translate(12.025 0)" fill="#4a4a4a"/>
<rect id="Rectangle_57" data-name="Rectangle 57" width="2.537" height="2.537" rx="1" transform="translate(15.031 0)" fill="#4a4a4a"/>
<rect id="Rectangle_58" data-name="Rectangle 58" width="2.537" height="2.537" rx="1" transform="translate(18.037 0)" fill="#4a4a4a"/>
<rect id="Rectangle_59" data-name="Rectangle 59" width="2.537" height="2.537" rx="1" transform="translate(21.042 0)" fill="#4a4a4a"/>
<rect id="Rectangle_60" data-name="Rectangle 60" width="2.537" height="2.537" rx="1" transform="translate(24.049 0)" fill="#4a4a4a"/>
<rect id="Rectangle_61" data-name="Rectangle 61" width="2.537" height="2.537" rx="1" transform="translate(27.055 0)" fill="#4a4a4a"/>
<rect id="Rectangle_62" data-name="Rectangle 62" width="2.537" height="2.537" rx="1" transform="translate(30.061 0)" fill="#4a4a4a"/>
</g>
<path id="Path_55" data-name="Path 55" d="M.52,0H3.8a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.017V.52A.519.519,0,0,1,.519,0Z" transform="translate(38.234 0)" fill="#4a4a4a" fill-rule="evenodd"/>
</g>
<g id="Group_7" data-name="Group 7" transform="translate(0.728 14.084)">
<rect id="Rectangle_63" data-name="Rectangle 63" width="2.537" height="2.537" rx="1" transform="translate(0 0)" fill="#4a4a4a"/>
<rect id="Rectangle_64" data-name="Rectangle 64" width="2.537" height="2.537" rx="1" transform="translate(3.006 0)" fill="#4a4a4a"/>
<rect id="Rectangle_65" data-name="Rectangle 65" width="2.537" height="2.537" rx="1" transform="translate(6.012 0)" fill="#4a4a4a"/>
<rect id="Rectangle_66" data-name="Rectangle 66" width="2.537" height="2.537" rx="1" transform="translate(9.018 0)" fill="#4a4a4a"/>
<path id="Path_56" data-name="Path 56" d="M.519,0H14.981A.519.519,0,0,1,15.5.519v1.5a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,2.018V.519A.519.519,0,0,1,.519,0Zm15.97,0h1.874a.519.519,0,0,1,.519.519v1.5a.519.519,0,0,1-.519.519H16.489a.519.519,0,0,1-.519-.519V.519A.519.519,0,0,1,16.489,0Z" transform="translate(12.024 0)" fill="#4a4a4a" fill-rule="evenodd"/>
<rect id="Rectangle_67" data-name="Rectangle 67" width="2.537" height="2.537" rx="1" transform="translate(31.376 0)" fill="#4a4a4a"/>
<rect id="Rectangle_68" data-name="Rectangle 68" width="2.537" height="2.537" rx="1" transform="translate(34.382 0)" fill="#4a4a4a"/>
<rect id="Rectangle_69" data-name="Rectangle 69" width="2.537" height="2.537" rx="1" transform="translate(40.018 0)" fill="#4a4a4a"/>
<path id="Path_57" data-name="Path 57" d="M2.537,0V.561a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,.561V0Z" transform="translate(39.736 1.08) rotate(180)" fill="#4a4a4a"/>
<path id="Path_58" data-name="Path 58" d="M2.537,0V.561a.519.519,0,0,1-.519.519H.519A.519.519,0,0,1,0,.561V0Z" transform="translate(37.2 1.456)" fill="#4a4a4a"/>
</g>
<rect id="Rectangle_70" data-name="Rectangle 70" width="42.273" height="1.127" rx="0.564" transform="translate(0.915 0.556)" fill="#4a4a4a"/>
<rect id="Rectangle_71" data-name="Rectangle 71" width="2.37" height="0.752" rx="0.376" transform="translate(1.949 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_72" data-name="Rectangle 72" width="2.37" height="0.752" rx="0.376" transform="translate(5.193 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_73" data-name="Rectangle 73" width="2.37" height="0.752" rx="0.376" transform="translate(7.688 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_74" data-name="Rectangle 74" width="2.37" height="0.752" rx="0.376" transform="translate(10.183 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_75" data-name="Rectangle 75" width="2.37" height="0.752" rx="0.376" transform="translate(12.679 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_76" data-name="Rectangle 76" width="2.37" height="0.752" rx="0.376" transform="translate(15.797 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_77" data-name="Rectangle 77" width="2.37" height="0.752" rx="0.376" transform="translate(18.292 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_78" data-name="Rectangle 78" width="2.37" height="0.752" rx="0.376" transform="translate(20.788 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_79" data-name="Rectangle 79" width="2.37" height="0.752" rx="0.376" transform="translate(23.283 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_80" data-name="Rectangle 80" width="2.37" height="0.752" rx="0.376" transform="translate(26.402 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_81" data-name="Rectangle 81" width="2.37" height="0.752" rx="0.376" transform="translate(28.897 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_82" data-name="Rectangle 82" width="2.37" height="0.752" rx="0.376" transform="translate(31.393 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_83" data-name="Rectangle 83" width="2.37" height="0.752" rx="0.376" transform="translate(34.512 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_84" data-name="Rectangle 84" width="2.37" height="0.752" rx="0.376" transform="translate(37.007 0.744)" fill="#d8d8d8" opacity="0.136"/>
<rect id="Rectangle_85" data-name="Rectangle 85" width="2.37" height="0.752" rx="0.376" transform="translate(39.502 0.744)" fill="#d8d8d8" opacity="0.136"/>
</g>
<path id="Path_59" data-name="Path 59" d="M123.779,148.389a2.583,2.583,0,0,0-.332.033c-.02-.078-.038-.156-.06-.234a2.594,2.594,0,1,0-2.567-4.455q-.086-.088-.174-.175a2.593,2.593,0,1,0-4.461-2.569c-.077-.022-.154-.04-.231-.06a2.6,2.6,0,1,0-5.128,0c-.077.02-.154.038-.231.06a2.594,2.594,0,1,0-4.461,2.569,10.384,10.384,0,1,0,17.314,9.992,2.592,2.592,0,1,0,.332-5.161" transform="translate(-51.054 -75.262)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_60" data-name="Path 60" d="M83,113.389h20.779V103H83Z" transform="translate(-41.443 -58.444)" fill="#3ecc5f" fill-rule="evenodd"/>
<path id="Path_61" data-name="Path 61" d="M123.389,108.944a1.3,1.3,0,1,0,0-2.6,1.338,1.338,0,0,0-.166.017c-.01-.039-.019-.078-.03-.117a1.3,1.3,0,0,0-.5-2.5,1.285,1.285,0,0,0-.783.269q-.043-.044-.087-.087a1.285,1.285,0,0,0,.263-.776,1.3,1.3,0,0,0-2.493-.509,5.195,5.195,0,1,0,0,10,1.3,1.3,0,0,0,2.493-.509,1.285,1.285,0,0,0-.263-.776q.044-.043.087-.087a1.285,1.285,0,0,0,.783.269,1.3,1.3,0,0,0,.5-2.5c.011-.038.02-.078.03-.117a1.335,1.335,0,0,0,.166.017" transform="translate(-55.859 -57.894)" fill="#44d860" fill-rule="evenodd"/>
<path id="Path_62" data-name="Path 62" d="M141.8,38.745a1.41,1.41,0,0,1-.255-.026,1.309,1.309,0,0,1-.244-.073,1.349,1.349,0,0,1-.224-.119,1.967,1.967,0,0,1-.2-.161,1.52,1.52,0,0,1-.161-.2,1.282,1.282,0,0,1-.218-.722,1.41,1.41,0,0,1,.026-.255,1.5,1.5,0,0,1,.072-.244,1.364,1.364,0,0,1,.12-.223,1.252,1.252,0,0,1,.358-.358,1.349,1.349,0,0,1,.224-.119,1.309,1.309,0,0,1,.244-.073,1.2,1.2,0,0,1,.509,0,1.262,1.262,0,0,1,.468.192,1.968,1.968,0,0,1,.2.161,1.908,1.908,0,0,1,.161.2,1.322,1.322,0,0,1,.12.223,1.361,1.361,0,0,1,.1.5,1.317,1.317,0,0,1-.379.919,1.968,1.968,0,0,1-.2.161,1.346,1.346,0,0,1-.223.119,1.332,1.332,0,0,1-.5.1m10.389-.649a1.326,1.326,0,0,1-.92-.379,1.979,1.979,0,0,1-.161-.2,1.282,1.282,0,0,1-.218-.722,1.326,1.326,0,0,1,.379-.919,1.967,1.967,0,0,1,.2-.161,1.351,1.351,0,0,1,.224-.119,1.308,1.308,0,0,1,.244-.073,1.2,1.2,0,0,1,.509,0,1.262,1.262,0,0,1,.468.192,1.967,1.967,0,0,1,.2.161,1.326,1.326,0,0,1,.379.919,1.461,1.461,0,0,1-.026.255,1.323,1.323,0,0,1-.073.244,1.847,1.847,0,0,1-.119.223,1.911,1.911,0,0,1-.161.2,1.967,1.967,0,0,1-.2.161,1.294,1.294,0,0,1-.722.218" transform="translate(-69.074 -26.006)" fill-rule="evenodd"/>
</g>
<g id="React-icon" transform="translate(906.3 541.56)">
<path id="Path_330" data-name="Path 330" d="M263.668,117.179c0-5.827-7.3-11.35-18.487-14.775,2.582-11.4,1.434-20.477-3.622-23.382a7.861,7.861,0,0,0-4.016-1v4a4.152,4.152,0,0,1,2.044.466c2.439,1.4,3.5,6.724,2.672,13.574-.2,1.685-.52,3.461-.914,5.272a86.9,86.9,0,0,0-11.386-1.954,87.469,87.469,0,0,0-7.459-8.965c5.845-5.433,11.332-8.41,15.062-8.41V78h0c-4.931,0-11.386,3.514-17.913,9.611-6.527-6.061-12.982-9.539-17.913-9.539v4c3.712,0,9.216,2.959,15.062,8.356a84.687,84.687,0,0,0-7.405,8.947,83.732,83.732,0,0,0-11.4,1.972c-.412-1.793-.717-3.532-.932-5.2-.843-6.85.2-12.175,2.618-13.592a3.991,3.991,0,0,1,2.062-.466v-4h0a8,8,0,0,0-4.052,1c-5.039,2.9-6.168,11.96-3.568,23.328-11.153,3.443-18.415,8.947-18.415,14.757,0,5.828,7.3,11.35,18.487,14.775-2.582,11.4-1.434,20.477,3.622,23.382a7.882,7.882,0,0,0,4.034,1c4.931,0,11.386-3.514,17.913-9.611,6.527,6.061,12.982,9.539,17.913,9.539a8,8,0,0,0,4.052-1c5.039-2.9,6.168-11.96,3.568-23.328C256.406,128.511,263.668,122.988,263.668,117.179Zm-23.346-11.96c-.663,2.313-1.488,4.7-2.421,7.083-.735-1.434-1.506-2.869-2.349-4.3-.825-1.434-1.7-2.833-2.582-4.2C235.517,104.179,237.974,104.645,240.323,105.219Zm-8.212,19.1c-1.4,2.421-2.833,4.716-4.321,6.85-2.672.233-5.379.359-8.1.359-2.708,0-5.415-.126-8.069-.341q-2.232-3.2-4.339-6.814-2.044-3.523-3.73-7.136c1.112-2.4,2.367-4.805,3.712-7.154,1.4-2.421,2.833-4.716,4.321-6.85,2.672-.233,5.379-.359,8.1-.359,2.708,0,5.415.126,8.069.341q2.232,3.2,4.339,6.814,2.044,3.523,3.73,7.136C234.692,119.564,233.455,121.966,232.11,124.315Zm5.792-2.331c.968,2.4,1.793,4.805,2.474,7.136-2.349.574-4.823,1.058-7.387,1.434.879-1.381,1.757-2.8,2.582-4.25C236.4,124.871,237.167,123.419,237.9,121.984ZM219.72,141.116a73.921,73.921,0,0,1-4.985-5.738c1.614.072,3.263.126,4.931.126,1.685,0,3.353-.036,4.985-.126A69.993,69.993,0,0,1,219.72,141.116ZM206.38,130.555c-2.546-.377-5-.843-7.352-1.417.663-2.313,1.488-4.7,2.421-7.083.735,1.434,1.506,2.869,2.349,4.3S205.5,129.192,206.38,130.555ZM219.63,93.241a73.924,73.924,0,0,1,4.985,5.738c-1.614-.072-3.263-.126-4.931-.126-1.686,0-3.353.036-4.985.126A69.993,69.993,0,0,1,219.63,93.241ZM206.362,103.8c-.879,1.381-1.757,2.8-2.582,4.25-.825,1.434-1.6,2.869-2.331,4.3-.968-2.4-1.793-4.805-2.474-7.136C201.323,104.663,203.8,104.179,206.362,103.8Zm-16.227,22.449c-6.348-2.708-10.454-6.258-10.454-9.073s4.106-6.383,10.454-9.073c1.542-.663,3.228-1.255,4.967-1.811a86.122,86.122,0,0,0,4.034,10.92,84.9,84.9,0,0,0-3.981,10.866C193.38,127.525,191.694,126.915,190.134,126.252Zm9.647,25.623c-2.439-1.4-3.5-6.724-2.672-13.574.2-1.686.52-3.461.914-5.272a86.9,86.9,0,0,0,11.386,1.954,87.465,87.465,0,0,0,7.459,8.965c-5.845,5.433-11.332,8.41-15.062,8.41A4.279,4.279,0,0,1,199.781,151.875Zm42.532-13.663c.843,6.85-.2,12.175-2.618,13.592a3.99,3.99,0,0,1-2.062.466c-3.712,0-9.216-2.959-15.062-8.356a84.689,84.689,0,0,0,7.405-8.947,83.731,83.731,0,0,0,11.4-1.972A50.194,50.194,0,0,1,242.313,138.212Zm6.9-11.96c-1.542.663-3.228,1.255-4.967,1.811a86.12,86.12,0,0,0-4.034-10.92,84.9,84.9,0,0,0,3.981-10.866c1.775.556,3.461,1.165,5.039,1.829,6.348,2.708,10.454,6.258,10.454,9.073C259.67,119.994,255.564,123.562,249.216,126.252Z" fill="#61dafb"/>
<path id="Path_331" data-name="Path 331" d="M320.8,78.4Z" transform="translate(-119.082 -0.328)" fill="#61dafb"/>
<circle id="Ellipse_112" data-name="Ellipse 112" cx="8.194" cy="8.194" r="8.194" transform="translate(211.472 108.984)" fill="#61dafb"/>
<path id="Path_332" data-name="Path 332" d="M520.5,78.1Z" transform="translate(-282.975 -0.082)" fill="#61dafb"/>
</g>
</g>
</svg>

@@ -0,0 +1,40 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1129" height="663" viewBox="0 0 1129 663">
<title>Focus on What Matters</title>
<circle cx="321" cy="321" r="321" fill="#f2f2f2" />
<ellipse cx="559" cy="635.49998" rx="514" ry="27.50002" fill="#3f3d56" />
<ellipse cx="558" cy="627" rx="460" ry="22" opacity="0.2" />
<rect x="131" y="152.5" width="840" height="50" fill="#3f3d56" />
<path d="M166.5,727.3299A21.67009,21.67009,0,0,0,188.1701,749H984.8299A21.67009,21.67009,0,0,0,1006.5,727.3299V296h-840Z" transform="translate(-35.5 -118.5)" fill="#3f3d56" />
<path d="M984.8299,236H188.1701A21.67009,21.67009,0,0,0,166.5,257.6701V296h840V257.6701A21.67009,21.67009,0,0,0,984.8299,236Z" transform="translate(-35.5 -118.5)" fill="#3f3d56" />
<path d="M984.8299,236H188.1701A21.67009,21.67009,0,0,0,166.5,257.6701V296h840V257.6701A21.67009,21.67009,0,0,0,984.8299,236Z" transform="translate(-35.5 -118.5)" opacity="0.2" />
<circle cx="181" cy="147.5" r="13" fill="#3f3d56" />
<circle cx="217" cy="147.5" r="13" fill="#3f3d56" />
<circle cx="253" cy="147.5" r="13" fill="#3f3d56" />
<rect x="168" y="213.5" width="337" height="386" rx="5.33505" fill="#606060" />
<rect x="603" y="272.5" width="284" height="22" rx="5.47638" fill="#2e8555" />
<rect x="537" y="352.5" width="416" height="15" rx="5.47638" fill="#2e8555" />
<rect x="537" y="396.5" width="416" height="15" rx="5.47638" fill="#2e8555" />
<rect x="537" y="440.5" width="416" height="15" rx="5.47638" fill="#2e8555" />
<rect x="537" y="484.5" width="416" height="15" rx="5.47638" fill="#2e8555" />
<rect x="865" y="552.5" width="88" height="26" rx="7.02756" fill="#3ecc5f" />
<path d="M1088.60287,624.61594a30.11371,30.11371,0,0,0,3.98291-15.266c0-13.79652-8.54358-24.98081-19.08256-24.98081s-19.08256,11.18429-19.08256,24.98081a30.11411,30.11411,0,0,0,3.98291,15.266,31.248,31.248,0,0,0,0,30.53213,31.248,31.248,0,0,0,0,30.53208,31.248,31.248,0,0,0,0,30.53208,30.11408,30.11408,0,0,0-3.98291,15.266c0,13.79652,8.54353,24.98081,19.08256,24.98081s19.08256-11.18429,19.08256-24.98081a30.11368,30.11368,0,0,0-3.98291-15.266,31.248,31.248,0,0,0,0-30.53208,31.248,31.248,0,0,0,0-30.53208,31.248,31.248,0,0,0,0-30.53213Z" transform="translate(-35.5 -118.5)" fill="#3f3d56" />
<ellipse cx="1038.00321" cy="460.31783" rx="19.08256" ry="24.9808" fill="#3f3d56" />
<ellipse cx="1038.00321" cy="429.78574" rx="19.08256" ry="24.9808" fill="#3f3d56" />
<path d="M1144.93871,339.34489a91.61081,91.61081,0,0,0,7.10658-10.46092l-50.141-8.23491,54.22885.4033a91.566,91.566,0,0,0,1.74556-72.42605l-72.75449,37.74139,67.09658-49.32086a91.41255,91.41255,0,1,0-150.971,102.29805,91.45842,91.45842,0,0,0-10.42451,16.66946l65.0866,33.81447-69.40046-23.292a91.46011,91.46011,0,0,0,14.73837,85.83669,91.40575,91.40575,0,1,0,143.68892,0,91.41808,91.41808,0,0,0,0-113.02862Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M981.6885,395.8592a91.01343,91.01343,0,0,0,19.56129,56.51431,91.40575,91.40575,0,1,0,143.68892,0C1157.18982,436.82067,981.6885,385.60008,981.6885,395.8592Z" transform="translate(-35.5 -118.5)" opacity="0.1" />
<path d="M365.62,461.43628H477.094v45.12043H365.62Z" transform="translate(-35.5 -118.5)" fill="#fff" fill-rule="evenodd" />
<path d="M264.76252,608.74122a26.50931,26.50931,0,0,1-22.96231-13.27072,26.50976,26.50976,0,0,0,22.96231,39.81215H291.304V608.74122Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M384.17242,468.57061l92.92155-5.80726V449.49263a26.54091,26.54091,0,0,0-26.54143-26.54143H331.1161l-3.31768-5.74622a3.83043,3.83043,0,0,0-6.63536,0l-3.31768,5.74622-3.31767-5.74622a3.83043,3.83043,0,0,0-6.63536,0l-3.31768,5.74622L301.257,417.205a3.83043,3.83043,0,0,0-6.63536,0L291.304,422.9512c-.02919,0-.05573.004-.08625.004l-5.49674-5.49541a3.8293,3.8293,0,0,0-6.4071,1.71723l-1.81676,6.77338L270.607,424.1031a3.82993,3.82993,0,0,0-4.6912,4.69253l1.84463,6.89148-6.77072,1.81411a3.8315,3.8315,0,0,0-1.71988,6.40975l5.49673,5.49673c0,.02787-.004.05574-.004.08493l-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74621,3.31768L259.0163,466.081a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31767a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31767a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31768a3.83042,3.83042,0,0,0,0,6.63535l5.74622,3.31768-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768L259.0163,558.976a3.83042,3.83042,0,0,0,0,6.63535l5.74622,3.31768-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768-5.74622,3.31768a3.83042,3.83042,0,0,0,0,6.63535l5.74622,3.31768-5.74622,3.31768a3.83043,3.83043,0,0,0,0,6.63536l5.74622,3.31768A26.54091,26.54091,0,0,0,291.304,635.28265H450.55254A26.5409,26.5409,0,0,0,477.094,608.74122V502.5755l-92.92155-5.80727a14.12639,14.12639,0,0,1,0-28.19762" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M424.01111,635.28265h39.81214V582.19979H424.01111Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M490.36468,602.10586a6.60242,6.60242,0,0,0-.848.08493c-.05042-.19906-.09821-.39945-.15393-.59852A6.62668,6.62668,0,1,0,482.80568,590.21q-.2203-.22491-.44457-.44589a6.62391,6.62391,0,1,0-11.39689-6.56369c-.1964-.05575-.39414-.10218-.59056-.15262a6.63957,6.63957,0,1,0-13.10086,0c-.1964.05042-.39414.09687-.59056.15262a6.62767,6.62767,0,1,0-11.39688,6.56369,26.52754,26.52754,0,1,0,44.23127,25.52756,6.6211,6.6211,0,1,0,.848-13.18579" transform="translate(-35.5 -118.5)" fill="#44d860" fill-rule="evenodd" />
<path d="M437.28182,555.65836H477.094V529.11693H437.28182Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M490.36468,545.70532a3.31768,3.31768,0,0,0,0-6.63536,3.41133,3.41133,0,0,0-.42333.04247c-.02655-.09953-.04911-.19907-.077-.29859a3.319,3.319,0,0,0-1.278-6.37923,3.28174,3.28174,0,0,0-2.00122.68742q-.10947-.11346-.22294-.22295a3.282,3.282,0,0,0,.67149-1.98265,3.31768,3.31768,0,0,0-6.37-1.2992,13.27078,13.27078,0,1,0,0,25.54082,3.31768,3.31768,0,0,0,6.37-1.2992,3.282,3.282,0,0,0-.67149-1.98265q.11347-.10947.22294-.22294a3.28174,3.28174,0,0,0,2.00122.68742,3.31768,3.31768,0,0,0,1.278-6.37923c.02786-.0982.05042-.19907.077-.29859a3.41325,3.41325,0,0,0,.42333.04246" transform="translate(-35.5 -118.5)" fill="#44d860" fill-rule="evenodd" />
<path d="M317.84538,466.081a3.31768,3.31768,0,0,1-3.31767-3.31768,9.953,9.953,0,1,0-19.90608,0,3.31768,3.31768,0,1,1-6.63535,0,16.58839,16.58839,0,1,1,33.17678,0,3.31768,3.31768,0,0,1-3.31768,3.31768" transform="translate(-35.5 -118.5)" fill-rule="evenodd" />
<path d="M370.92825,635.28265h79.62429A26.5409,26.5409,0,0,0,477.094,608.74122v-92.895H397.46968a26.54091,26.54091,0,0,0-26.54143,26.54143Z" transform="translate(-35.5 -118.5)" fill="#ffff50" fill-rule="evenodd" />
<path d="M457.21444,556.98543H390.80778a1.32707,1.32707,0,0,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414m0,26.54143H390.80778a1.32707,1.32707,0,1,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414m0,26.54143H390.80778a1.32707,1.32707,0,1,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414m0-66.10674H390.80778a1.32707,1.32707,0,0,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414m0,26.29459H390.80778a1.32707,1.32707,0,0,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414m0,26.54143H390.80778a1.32707,1.32707,0,0,1,0-2.65414h66.40666a1.32707,1.32707,0,0,1,0,2.65414M477.094,474.19076c-.01592,0-.0292-.008-.04512-.00663-4.10064.13934-6.04083,4.24132-7.75274,7.86024-1.78623,3.78215-3.16771,6.24122-5.43171,6.16691-2.50685-.09024-3.94007-2.92222-5.45825-5.91874-1.74377-3.44243-3.73438-7.34667-7.91333-7.20069-4.04227.138-5.98907,3.70784-7.70631,6.857-1.82738,3.35484-3.07084,5.39455-5.46887,5.30033-2.55727-.09289-3.91619-2.39536-5.48877-5.06013-1.75306-2.96733-3.77951-6.30359-7.8775-6.18946-3.97326.13669-5.92537,3.16507-7.64791,5.83912-1.82207,2.82666-3.09872,4.5492-5.52725,4.447-2.61832-.09289-3.9706-2.00388-5.53522-4.21611-1.757-2.4856-3.737-5.299-7.82308-5.16231-3.88567.13271-5.83779,2.61434-7.559,4.80135-1.635,2.07555-2.9116,3.71846-5.61218,3.615a1.32793,1.32793,0,1,0-.09555,2.65414c4.00377.134,6.03154-2.38873,7.79257-4.6275,1.562-1.9853,2.91027-3.69855,5.56441-3.78879,2.55594-.10882,3.75429,1.47968,5.56707,4.04093,1.7212,2.43385,3.67465,5.19416,7.60545,5.33616,4.11789.138,6.09921-2.93946,7.8536-5.66261,1.56861-2.43385,2.92221-4.53461,5.50734-4.62352,2.37944-.08892,3.67466,1.79154,5.50072,4.885,1.72121,2.91557,3.67069,6.21865,7.67977,6.36463,4.14709.14332,6.14965-3.47693,7.89475-6.68181,1.51155-2.77092,2.93814-5.38791,5.46621-5.4755,2.37944-.05573,3.62025,2.11668,5.45558,5.74622,1.71459,3.388,3.65875,7.22591,7.73019,7.37321l.22429.004c4.06614,0,5.99571-4.08074,7.70364-7.68905,1.51154-3.19825,2.94211-6.21069,5.3972-6.33411Z" transform="translate(-35.5 
-118.5)" fill-rule="evenodd" />
<path d="M344.38682,635.28265h53.08286V582.19979H344.38682Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M424.01111,602.10586a6.60242,6.60242,0,0,0-.848.08493c-.05042-.19906-.09821-.39945-.15394-.59852A6.62667,6.62667,0,1,0,416.45211,590.21q-.2203-.22491-.44458-.44589a6.62391,6.62391,0,1,0-11.39689-6.56369c-.1964-.05575-.39413-.10218-.59054-.15262a6.63957,6.63957,0,1,0-13.10084,0c-.19641.05042-.39414.09687-.59055.15262a6.62767,6.62767,0,1,0-11.39689,6.56369,26.52755,26.52755,0,1,0,44.2313,25.52756,6.6211,6.6211,0,1,0,.848-13.18579" transform="translate(-35.5 -118.5)" fill="#44d860" fill-rule="evenodd" />
<path d="M344.38682,555.65836h53.08286V529.11693H344.38682Z" transform="translate(-35.5 -118.5)" fill="#3ecc5f" fill-rule="evenodd" />
<path d="M410.74039,545.70532a3.31768,3.31768,0,1,0,0-6.63536,3.41133,3.41133,0,0,0-.42333.04247c-.02655-.09953-.04911-.19907-.077-.29859a3.319,3.319,0,0,0-1.278-6.37923,3.28174,3.28174,0,0,0-2.00122.68742q-.10947-.11346-.22294-.22295a3.282,3.282,0,0,0,.67149-1.98265,3.31768,3.31768,0,0,0-6.37-1.2992,13.27078,13.27078,0,1,0,0,25.54082,3.31768,3.31768,0,0,0,6.37-1.2992,3.282,3.282,0,0,0-.67149-1.98265q.11347-.10947.22294-.22294a3.28174,3.28174,0,0,0,2.00122.68742,3.31768,3.31768,0,0,0,1.278-6.37923c.02786-.0982.05042-.19907.077-.29859a3.41325,3.41325,0,0,0,.42333.04246" transform="translate(-35.5 -118.5)" fill="#44d860" fill-rule="evenodd" />
<path d="M424.01111,447.8338a3.60349,3.60349,0,0,1-.65028-.06636,3.34415,3.34415,0,0,1-.62372-.18579,3.44679,3.44679,0,0,1-.572-.30522,5.02708,5.02708,0,0,1-.50429-.4114,3.88726,3.88726,0,0,1-.41007-.50428,3.27532,3.27532,0,0,1-.55737-1.84463,3.60248,3.60248,0,0,1,.06636-.65027,3.82638,3.82638,0,0,1,.18447-.62373,3.48858,3.48858,0,0,1,.30656-.57064,3.197,3.197,0,0,1,.91436-.91568,3.44685,3.44685,0,0,1,.572-.30523,3.344,3.344,0,0,1,.62372-.18578,3.06907,3.06907,0,0,1,1.30053,0,3.22332,3.22332,0,0,1,1.19436.491,5.02835,5.02835,0,0,1,.50429.41139,4.8801,4.8801,0,0,1,.41139.50429,3.38246,3.38246,0,0,1,.30522.57064,3.47806,3.47806,0,0,1,.25215,1.274A3.36394,3.36394,0,0,1,426.36,446.865a5.02708,5.02708,0,0,1-.50429.4114,3.3057,3.3057,0,0,1-1.84463.55737m26.54143-1.65884a3.38754,3.38754,0,0,1-2.35024-.96877,5.04185,5.04185,0,0,1-.41007-.50428,3.27532,3.27532,0,0,1-.55737-1.84463,3.38659,3.38659,0,0,1,.96744-2.34892,5.02559,5.02559,0,0,1,.50429-.41139,3.44685,3.44685,0,0,1,.572-.30523,3.3432,3.3432,0,0,1,.62373-.18579,3.06952,3.06952,0,0,1,1.30052,0,3.22356,3.22356,0,0,1,1.19436.491,5.02559,5.02559,0,0,1,.50429.41139,3.38792,3.38792,0,0,1,.96876,2.34892,3.72635,3.72635,0,0,1-.06636.65026,3.37387,3.37387,0,0,1-.18579.62373,4.71469,4.71469,0,0,1-.30522.57064,4.8801,4.8801,0,0,1-.41139.50429,5.02559,5.02559,0,0,1-.50429.41139,3.30547,3.30547,0,0,1-1.84463.55737" transform="translate(-35.5 -118.5)" fill-rule="evenodd" />
</svg>

docs/tsconfig.json Normal file
View File

@@ -0,0 +1,8 @@
{
// This file is not used in compilation. It is here just for a nice editor experience.
"extends": "@docusaurus/tsconfig",
"compilerOptions": {
"baseUrl": "."
},
"exclude": [".docusaurus", "build"]
}

View File

@@ -22,20 +22,21 @@ classifiers = [
"Topic :: Software Development :: Libraries :: Python Modules"
]
dependencies = [
"langchain-anthropic>=0.3.1",
"langchain-openai",
"langchain-google-genai",
"langgraph>=0.2.70",
"langgraph-checkpoint>=2.0.9",
"langgraph-sdk>=0.1.48",
"langchain-core>=0.3.34",
"langchain-anthropic>=0.3.7",
"langchain-openai>=0.3.5",
"langchain-google-genai>=2.0.9",
"langgraph>=0.2.71",
"langgraph-checkpoint>=2.0.12",
"langgraph-sdk>=0.1.51",
"langchain-core>=0.3.35",
"langchain>=0.3.18",
"rich>=13.0.0",
"GitPython>=3.1",
"fuzzywuzzy==0.18.0",
"rapidfuzz>=3.11.0",
"pathspec>=0.11.0",
"aider-chat>=0.73.0",
"pyte>=0.8.2",
"aider-chat>=0.72.0",
"tavily-python>=0.5.0",
"litellm",
"fastapi>=0.104.0",

View File

@@ -7,6 +7,7 @@ from datetime import datetime
from langgraph.checkpoint.memory import MemorySaver
from rich.console import Console
from rich.panel import Panel
from rich.text import Text
from ra_aid import print_error, print_stage_header
from ra_aid.__version__ import __version__
@@ -113,9 +114,9 @@ Examples:
parser.add_argument(
"--expert-provider",
type=str,
default="openai",
default=None,
choices=VALID_PROVIDERS,
help="The LLM provider to use for expert knowledge queries (default: openai)",
help="The LLM provider to use for expert knowledge queries",
)
parser.add_argument(
"--expert-model",
@@ -234,10 +235,18 @@ Examples:
# Handle expert provider/model defaults
if not parsed_args.expert_provider:
# If no expert provider specified, use main provider instead of defaulting to
# to any particular model since we do not know if we have access to any other model.
parsed_args.expert_provider = parsed_args.provider
parsed_args.expert_model = parsed_args.model
# Check for OpenAI API key first
if os.environ.get("OPENAI_API_KEY"):
parsed_args.expert_provider = "openai"
parsed_args.expert_model = None # Will be auto-selected
# If no OpenAI key but DeepSeek key exists, use DeepSeek
elif os.environ.get("DEEPSEEK_API_KEY"):
parsed_args.expert_provider = "deepseek"
parsed_args.expert_model = "deepseek-reasoner"
else:
# Fall back to main provider if neither is available
parsed_args.expert_provider = parsed_args.provider
parsed_args.expert_model = parsed_args.model
# Validate temperature range if provided
if parsed_args.temperature is not None and not (
@@ -299,27 +308,55 @@ def main():
) # Will exit if main env vars missing
logger.debug("Environment validation successful")
if expert_missing:
console.print(
Panel(
"[yellow]Expert tools disabled due to missing configuration:[/yellow]\n"
+ "\n".join(f"- {m}" for m in expert_missing)
+ "\nSet the required environment variables or args to enable expert mode.",
title="Expert Tools Disabled",
style="yellow",
# Validate model configuration early
from ra_aid.models_params import models_params
model_config = models_params.get(args.provider, {}).get(args.model or "", {})
supports_temperature = model_config.get(
"supports_temperature",
args.provider
in ["anthropic", "openai", "openrouter", "openai-compatible", "deepseek"],
)
if supports_temperature and args.temperature is None:
args.temperature = model_config.get("default_temperature")
if args.temperature is None:
print_error(
f"Temperature must be provided for model {args.model} which supports temperature"
)
sys.exit(1)
logger.debug(
f"Using default temperature {args.temperature} for model {args.model}"
)
if web_research_missing:
console.print(
Panel(
"[yellow]Web research disabled due to missing configuration:[/yellow]\n"
+ "\n".join(f"- {m}" for m in web_research_missing)
+ "\nSet the required environment variables to enable web research.",
title="Web Research Disabled",
style="yellow",
)
)
# Display status lines
status = Text()
# Model info
status.append("🤖 ")
status.append(f"{args.provider}/{args.model}")
if args.temperature is not None:
status.append(f" @ T{args.temperature}")
status.append("\n")
# Expert info
status.append("🤔 ")
if expert_enabled:
status.append(f"{args.expert_provider}/{args.expert_model}")
else:
status.append("Expert: ")
status.append("Disabled", style="italic")
status.append("\n")
# Search info
status.append("🔍 Search: ")
status.append(
"Enabled" if web_research_enabled else "Disabled",
style=None if web_research_enabled else "italic",
)
console.print(
Panel(status, title="Config", border_style="bright_blue", padding=(0, 1))
)
# Handle chat mode
if args.chat:
@@ -368,6 +405,7 @@ def main():
_global_memory["config"]["model"] = args.model
_global_memory["config"]["expert_provider"] = args.expert_provider
_global_memory["config"]["expert_model"] = args.expert_model
_global_memory["config"]["temperature"] = args.temperature
# Create chat agent with appropriate tools
chat_agent = create_agent(
@@ -442,6 +480,8 @@ def main():
# Store fallback tool configuration
_global_memory["config"]["no_fallback_tool"] = args.no_fallback_tool
# Store temperature in global config
_global_memory["config"]["temperature"] = args.temperature
# Run research stage
print_stage_header("Research Stage")

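Read together, the two `__main__.py` hunks above establish a precedence for picking the expert provider. A minimal sketch of that precedence (the function name and the injected `env` dict are hypothetical; the real logic runs inline on `parsed_args`):

```python
import os

def resolve_expert_provider(main_provider, main_model, env=None):
    """Precedence from the hunk above: an OpenAI key wins, then a DeepSeek key,
    then fall back to the main provider/model."""
    env = os.environ if env is None else env
    if env.get("OPENAI_API_KEY"):
        return "openai", None  # expert model is auto-selected later
    if env.get("DEEPSEEK_API_KEY"):
        return "deepseek", "deepseek-reasoner"
    return main_provider, main_model
```

With both keys set, OpenAI still wins because it is checked first.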
View File

@@ -1,3 +1,3 @@
"""Version information."""
__version__ = "0.13.2"
__version__ = "0.14.1"

View File

@@ -10,8 +10,8 @@ from langchain_openai import ChatOpenAI
class ChatDeepseekReasoner(ChatOpenAI):
"""ChatDeepseekReasoner with custom overrides for R1/reasoner models."""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
def __init__(self, *args, timeout: int = 180, max_retries: int = 5, **kwargs):
super().__init__(*args, timeout=timeout, max_retries=max_retries, **kwargs)
def invocation_params(
self, options: Optional[Dict[str, Any]] = None, **kwargs: Any

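The pattern in this hunk — promoting `timeout` and `max_retries` to explicit keyword defaults and forwarding them to the parent initializer — can be sketched without langchain installed (`Base` is a stand-in for `ChatOpenAI`):

```python
class Base:
    """Stand-in for ChatOpenAI: accepts timeout/max_retries like the real client."""
    def __init__(self, *args, timeout=None, max_retries=2, **kwargs):
        self.timeout = timeout
        self.max_retries = max_retries

class Reasoner(Base):
    # Same shape as ChatDeepseekReasoner.__init__ above: new defaults, forwarded up.
    def __init__(self, *args, timeout: int = 180, max_retries: int = 5, **kwargs):
        super().__init__(*args, timeout=timeout, max_retries=max_retries, **kwargs)
```

Callers that pass their own values still override the subclass defaults.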
View File

@@ -1,5 +1,6 @@
"""Module for efficient file listing using git."""
import os
import subprocess
from pathlib import Path
from typing import List, Optional, Tuple
@@ -70,7 +71,7 @@ def is_git_repo(directory: str) -> bool:
def get_file_listing(
directory: str, limit: Optional[int] = None
directory: str, limit: Optional[int] = None, include_hidden: bool = False
) -> Tuple[List[str], int]:
"""
Get a list of tracked files in a git repository.
@@ -82,6 +83,7 @@
Args:
directory: Path to the git repository
limit: Optional maximum number of files to return
include_hidden: Whether to include hidden files (starting with .) in the results
Returns:
Tuple[List[str], int]: Tuple containing:
@@ -95,42 +97,72 @@
FileListerError: For other unexpected errors
"""
try:
# Check if directory is a git repo first
# Check if directory exists and is accessible
if not os.path.exists(directory):
raise DirectoryNotFoundError(f"Directory not found: {directory}")
if not os.path.isdir(directory):
raise DirectoryNotFoundError(f"Not a directory: {directory}")
# Check if it's a git repository
if not is_git_repo(directory):
return [], 0
# Run git ls-files
result = subprocess.run(
["git", "ls-files"],
cwd=directory,
capture_output=True,
text=True,
check=True,
)
# Get list of files from git ls-files
try:
# Get both tracked and untracked files
tracked_files_process = subprocess.run(
["git", "ls-files"],
cwd=directory,
capture_output=True,
text=True,
check=True,
)
untracked_files_process = subprocess.run(
["git", "ls-files", "--others", "--exclude-standard"],
cwd=directory,
capture_output=True,
text=True,
check=True,
)
except subprocess.CalledProcessError as e:
raise GitCommandError(f"Git command failed: {e}")
except PermissionError as e:
raise DirectoryAccessError(f"Permission denied: {e}")
# Process the output
files = [line.strip() for line in result.stdout.splitlines() if line.strip()]
# Combine and process the files
all_files = []
for file in (
tracked_files_process.stdout.splitlines()
+ untracked_files_process.stdout.splitlines()
):
file = file.strip()
if not file:
continue
# Skip hidden files unless explicitly included
if not include_hidden and (
file.startswith(".")
or any(part.startswith(".") for part in file.split("/"))
):
continue
# Skip .aider files
if ".aider" in file:
continue
all_files.append(file)
# Deduplicate and sort for consistency
files = list(dict.fromkeys(files)) # Remove duplicates while preserving order
# Remove duplicates and sort
all_files = sorted(set(all_files))
total_count = len(all_files)
# Sort for consistency
files.sort()
# Get total count before truncation
total_count = len(files)
# Truncate if limit specified
# Apply limit if specified
if limit is not None:
files = files[:limit]
all_files = all_files[:limit]
return files, total_count
return all_files, total_count
except subprocess.CalledProcessError as e:
raise GitCommandError(f"Git command failed: {e}")
except (DirectoryNotFoundError, DirectoryAccessError, GitCommandError):
# Re-raise known exceptions
raise
except PermissionError as e:
raise DirectoryAccessError(f"Cannot access directory {directory}: {e}")
raise DirectoryAccessError(f"Permission denied: {e}")
except Exception as e:
if isinstance(e, FileListerError):
raise
raise FileListerError(f"Error listing files: {e}")
raise FileListerError(f"Unexpected error: {e}")

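The filtering rules in the new `file_listing.py` hunk (skip blanks, hidden path components, and `.aider` artifacts) reduce to one predicate. A sketch, using a hypothetical `keep_file` helper that is not part of the module:

```python
def keep_file(path: str, include_hidden: bool = False) -> bool:
    """Mirror get_file_listing's filter: drop empty names, .aider artifacts,
    and any path with a hidden component unless include_hidden is set."""
    path = path.strip()
    if not path or ".aider" in path:
        return False
    if not include_hidden and any(part.startswith(".") for part in path.split("/")):
        return False
    return True
```

Note that checking every path component means `docs/.cache/x` is excluded too, not just top-level dotfiles.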
View File

@@ -6,12 +6,57 @@ from langchain_core.language_models import BaseChatModel
from langchain_core.messages import BaseMessage
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_openai import ChatOpenAI
from openai import OpenAI
from ra_aid.chat_models.deepseek_chat import ChatDeepseekReasoner
from ra_aid.logging_config import get_logger
from .models_params import models_params
def get_available_openai_models() -> List[str]:
"""Fetch available OpenAI models using OpenAI client.
Returns:
List of available model names
"""
try:
# Use OpenAI client to fetch models
client = OpenAI()
models = client.models.list()
return [str(model.id) for model in models.data]
except Exception:
# Return empty list if unable to fetch models
return []
def select_expert_model(provider: str, model: Optional[str] = None) -> Optional[str]:
"""Select appropriate expert model based on provider and availability.
Args:
provider: The LLM provider
model: Optional explicitly specified model name
Returns:
Selected model name or None if no suitable model found
"""
if provider != "openai" or model is not None:
return model
# Try to get available models
available_models = get_available_openai_models()
# Priority order for expert models
priority_models = ["o3-mini", "o1", "o1-preview"]
# Return first available model from priority list
for model_name in priority_models:
if model_name in available_models:
return model_name
return None
known_temp_providers = {
"openai",
"anthropic",
@@ -21,6 +66,10 @@ known_temp_providers = {
"deepseek",
}
# Constants for API request configuration
LLM_REQUEST_TIMEOUT = 180
LLM_MAX_RETRIES = 5
logger = get_logger(__name__)
@@ -52,6 +101,8 @@ def create_deepseek_client(
0 if is_expert else (temperature if temperature is not None else 1)
),
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
)
return ChatOpenAI(
@@ -59,6 +110,8 @@
base_url=base_url,
temperature=0 if is_expert else (temperature if temperature is not None else 1),
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
)
@@ -77,12 +130,16 @@ def create_openrouter_client(
0 if is_expert else (temperature if temperature is not None else 1)
),
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
)
return ChatOpenAI(
api_key=api_key,
base_url="https://openrouter.ai/api/v1",
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
**({"temperature": temperature} if temperature is not None else {}),
)
@@ -144,6 +201,11 @@ def create_llm_client(
if not config:
raise ValueError(f"Unsupported provider: {provider}")
if is_expert and provider == "openai":
model_name = select_expert_model(provider, model_name)
if not model_name:
raise ValueError("No suitable expert model available")
logger.debug(
"Creating LLM client with provider=%s, model=%s, temperature=%s, expert=%s",
provider,
@@ -164,10 +226,12 @@
# Handle temperature settings
if is_expert:
temp_kwargs = {"temperature": 0} if supports_temperature else {}
elif temperature is not None and supports_temperature:
elif supports_temperature:
if temperature is None:
raise ValueError(
f"Temperature must be provided for model {model_name} which supports temperature"
)
temp_kwargs = {"temperature": temperature}
elif provider == "openai-compatible" and supports_temperature:
temp_kwargs = {"temperature": 0.3}
else:
temp_kwargs = {}
@@ -194,11 +258,19 @@
}
if is_expert:
openai_kwargs["reasoning_effort"] = "high"
return ChatOpenAI(**openai_kwargs)
return ChatOpenAI(
**{
**openai_kwargs,
"timeout": LLM_REQUEST_TIMEOUT,
"max_retries": LLM_MAX_RETRIES,
}
)
elif provider == "anthropic":
return ChatAnthropic(
api_key=config["api_key"],
model_name=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
**temp_kwargs,
)
elif provider == "openai-compatible":
@@ -206,12 +278,16 @@
api_key=config["api_key"],
base_url=config["base_url"],
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
**temp_kwargs,
)
elif provider == "gemini":
return ChatGoogleGenerativeAI(
api_key=config["api_key"],
model=model_name,
timeout=LLM_REQUEST_TIMEOUT,
max_retries=LLM_MAX_RETRIES,
**temp_kwargs,
)
else:

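`select_expert_model`'s priority walk in the hunk above is a plain first-match scan over `["o3-mini", "o1", "o1-preview"]`. A standalone sketch (`pick_expert_model` is a hypothetical name; the real function also fetches the model list from the OpenAI API):

```python
def pick_expert_model(available, priority=("o3-mini", "o1", "o1-preview")):
    """Return the first priority model the account actually has, else None."""
    available = set(available)
    for name in priority:
        if name in available:
            return name
    return None
```

Because the scan follows `priority` rather than `available`, the account's listing order never affects which model wins.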
View File

@@ -3,329 +3,870 @@ List of model parameters
"""
DEFAULT_TOKEN_LIMIT = 100000
DEFAULT_TEMPERATURE = 0.7
models_params = {
"openai": {
"gpt-3.5-turbo-0125": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5": {"token_limit": 4096, "supports_temperature": True},
"gpt-3.5-turbo": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5-turbo-1106": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5-turbo-instruct": {"token_limit": 4096, "supports_temperature": True},
"gpt-4-0125-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo-2024-04-09": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-1106-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-vision-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4": {"token_limit": 8192, "supports_temperature": True},
"gpt-4-0613": {"token_limit": 8192, "supports_temperature": True},
"gpt-4-32k": {"token_limit": 32768, "supports_temperature": True},
"gpt-4-32k-0613": {"token_limit": 32768, "supports_temperature": True},
"gpt-4o": {"token_limit": 128000, "supports_temperature": True},
"gpt-4o-2024-08-06": {"token_limit": 128000, "supports_temperature": True},
"gpt-4o-2024-05-13": {"token_limit": 128000, "supports_temperature": True},
"gpt-4o-mini": {"token_limit": 128000, "supports_temperature": True},
"gpt-3.5-turbo-0125": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo-1106": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo-instruct": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-0125-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo-2024-04-09": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-1106-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-vision-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-0613": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-32k": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-32k-0613": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o-2024-08-06": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o-2024-05-13": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o-mini": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"o1-preview": {"token_limit": 128000, "supports_temperature": False},
"o1-mini": {"token_limit": 128000, "supports_temperature": False},
"o1": {"token_limit": 200000, "supports_temperature": False},
"o3-mini": {"token_limit": 200000, "supports_temperature": False},
},
"azure_openai": {
"gpt-3.5-turbo-0125": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5": {"token_limit": 4096, "supports_temperature": True},
"gpt-3.5-turbo": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5-turbo-1106": {"token_limit": 16385, "supports_temperature": True},
"gpt-3.5-turbo-instruct": {"token_limit": 4096, "supports_temperature": True},
"gpt-4-0125-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-turbo-2024-04-09": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-1106-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4-vision-preview": {"token_limit": 128000, "supports_temperature": True},
"gpt-4": {"token_limit": 8192, "supports_temperature": True},
"gpt-4-0613": {"token_limit": 8192, "supports_temperature": True},
"gpt-4-32k": {"token_limit": 32768, "supports_temperature": True},
"gpt-4-32k-0613": {"token_limit": 32768, "supports_temperature": True},
"gpt-4o": {"token_limit": 128000, "supports_temperature": True},
"gpt-4o-mini": {"token_limit": 128000, "supports_temperature": True},
"chatgpt-4o-latest": {"token_limit": 128000, "supports_temperature": True},
"gpt-3.5-turbo-0125": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo-1106": {
"token_limit": 16385,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-3.5-turbo-instruct": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-0125-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-turbo-2024-04-09": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-1106-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-vision-preview": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-0613": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-32k": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4-32k-0613": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gpt-4o-mini": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"chatgpt-4o-latest": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"o1-preview": {"token_limit": 128000, "supports_temperature": False},
"o1-mini": {"token_limit": 128000, "supports_temperature": False},
},
"google_genai": {
"gemini-pro": {"token_limit": 128000, "supports_temperature": True},
"gemini-pro": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemini-1.5-flash-latest": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemini-1.5-pro-latest": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"models/embedding-001": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemini-1.5-pro-latest": {"token_limit": 128000, "supports_temperature": True},
"models/embedding-001": {"token_limit": 2048, "supports_temperature": True},
},
"google_vertexai": {
"gemini-1.5-flash": {"token_limit": 128000, "supports_temperature": True},
"gemini-1.5-pro": {"token_limit": 128000, "supports_temperature": True},
"gemini-1.0-pro": {"token_limit": 128000, "supports_temperature": True},
"gemini-1.5-flash": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemini-1.5-pro": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemini-1.0-pro": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
},
"ollama": {
"command-r": {"token_limit": 12800, "supports_temperature": True},
"codellama": {"token_limit": 16000, "supports_temperature": True},
"dbrx": {"token_limit": 32768, "supports_temperature": True},
"deepseek-coder:33b": {"token_limit": 16000, "supports_temperature": True},
"falcon": {"token_limit": 2048, "supports_temperature": True},
"llama2": {"token_limit": 4096, "supports_temperature": True},
"llama2:7b": {"token_limit": 4096, "supports_temperature": True},
"llama2:13b": {"token_limit": 4096, "supports_temperature": True},
"llama2:70b": {"token_limit": 4096, "supports_temperature": True},
"llama3": {"token_limit": 8192, "supports_temperature": True},
"llama3:8b": {"token_limit": 8192, "supports_temperature": True},
"llama3:70b": {"token_limit": 8192, "supports_temperature": True},
"llama3.1": {"token_limit": 128000, "supports_temperature": True},
"llama3.1:8b": {"token_limit": 128000, "supports_temperature": True},
"llama3.1:70b": {"token_limit": 128000, "supports_temperature": True},
"lama3.1:405b": {"token_limit": 128000, "supports_temperature": True},
"llama3.2": {"token_limit": 128000, "supports_temperature": True},
"llama3.2:1b": {"token_limit": 128000, "supports_temperature": True},
"llama3.2:3b": {"token_limit": 128000, "supports_temperature": True},
"llama3.3:70b": {"token_limit": 128000, "supports_temperature": True},
"scrapegraph": {"token_limit": 8192, "supports_temperature": True},
"mistral-small": {"token_limit": 128000, "supports_temperature": True},
"mistral-openorca": {"token_limit": 32000, "supports_temperature": True},
"mistral-large": {"token_limit": 128000, "supports_temperature": True},
"grok-1": {"token_limit": 8192, "supports_temperature": True},
"llava": {"token_limit": 4096, "supports_temperature": True},
"mixtral:8x22b-instruct": {"token_limit": 65536, "supports_temperature": True},
"nomic-embed-text": {"token_limit": 8192, "supports_temperature": True},
"nous-hermes2:34b": {"token_limit": 4096, "supports_temperature": True},
"orca-mini": {"token_limit": 2048, "supports_temperature": True},
"phi3:3.8b": {"token_limit": 12800, "supports_temperature": True},
"phi3:14b": {"token_limit": 128000, "supports_temperature": True},
"qwen:0.5b": {"token_limit": 32000, "supports_temperature": True},
"qwen:1.8b": {"token_limit": 32000, "supports_temperature": True},
"qwen:4b": {"token_limit": 32000, "supports_temperature": True},
"qwen:14b": {"token_limit": 32000, "supports_temperature": True},
"qwen:32b": {"token_limit": 32000, "supports_temperature": True},
"qwen:72b": {"token_limit": 32000, "supports_temperature": True},
"qwen:110b": {"token_limit": 32000, "supports_temperature": True},
"stablelm-zephyr": {"token_limit": 8192, "supports_temperature": True},
"wizardlm2:8x22b": {"token_limit": 65536, "supports_temperature": True},
"mistral": {"token_limit": 128000, "supports_temperature": True},
"gemma2": {"token_limit": 128000, "supports_temperature": True},
"gemma2:9b": {"token_limit": 128000, "supports_temperature": True},
"gemma2:27b": {"token_limit": 128000, "supports_temperature": True},
"command-r": {
"token_limit": 12800,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"codellama": {
"token_limit": 16000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"dbrx": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"deepseek-coder:33b": {
"token_limit": 16000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"falcon": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama2": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama2:7b": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama2:13b": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama2:70b": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3:8b": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3:70b": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.1": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.1:8b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.1:70b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"lama3.1:405b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.2": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.2:1b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.2:3b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3.3:70b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"scrapegraph": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral-small": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral-openorca": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral-large": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"grok-1": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llava": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mixtral:8x22b-instruct": {
"token_limit": 65536,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"nomic-embed-text": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"nous-hermes2:34b": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"orca-mini": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"phi3:3.8b": {
"token_limit": 12800,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"phi3:14b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:0.5b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:1.8b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:4b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:14b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:32b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:72b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"qwen:110b": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"stablelm-zephyr": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"wizardlm2:8x22b": {
"token_limit": 65536,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemma2": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemma2:9b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemma2:27b": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
# embedding models
"shaw/dmeta-embedding-zh-small-q4": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"shaw/dmeta-embedding-zh-q4": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"chevalblanc/acge_text_embedding": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"martcreation/dmeta-embedding-zh": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"snowflake-arctic-embed": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mxbai-embed-large": {
"token_limit": 512,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"snowflake-arctic-embed": {"token_limit": 8192, "supports_temperature": True},
"mxbai-embed-large": {"token_limit": 512, "supports_temperature": True},
},
"oneapi": {"qwen-turbo": {"token_limit": 6000, "supports_temperature": True}},
"oneapi": {
"qwen-turbo": {
"token_limit": 6000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
}
},
"nvidia": {
"meta/llama3-70b-instruct": {"token_limit": 419, "supports_temperature": True},
"meta/llama3-8b-instruct": {"token_limit": 419, "supports_temperature": True},
"nemotron-4-340b-instruct": {"token_limit": 1024, "supports_temperature": True},
"databricks/dbrx-instruct": {"token_limit": 4096, "supports_temperature": True},
"google/codegemma-7b": {"token_limit": 8192, "supports_temperature": True},
"google/gemma-2b": {"token_limit": 2048, "supports_temperature": True},
"google/gemma-7b": {"token_limit": 8192, "supports_temperature": True},
"google/recurrentgemma-2b": {"token_limit": 2048, "supports_temperature": True},
"meta/codellama-70b": {"token_limit": 16384, "supports_temperature": True},
"meta/llama2-70b": {"token_limit": 4096, "supports_temperature": True},
"meta/llama3-70b-instruct": {
"token_limit": 419,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta/llama3-8b-instruct": {
"token_limit": 419,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"nemotron-4-340b-instruct": {
"token_limit": 1024,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"databricks/dbrx-instruct": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/codegemma-7b": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/gemma-2b": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/gemma-7b": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/recurrentgemma-2b": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta/codellama-70b": {
"token_limit": 16384,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta/llama2-70b": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"microsoft/phi-3-mini-128k-instruct": {
"token_limit": 122880,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistralai/mistral-7b-instruct-v0.2": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistralai/mistral-large": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistralai/mistral-large": {"token_limit": 8192, "supports_temperature": True},
"mistralai/mixtral-8x22b-instruct-v0.1": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistralai/mixtral-8x7b-instruct-v0.1": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"snowflake/arctic": {
"token_limit": 16384,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"snowflake/arctic": {"token_limit": 16384, "supports_temperature": True},
},
"groq": {
"llama3-8b-8192": {"token_limit": 8192, "supports_temperature": True},
"llama3-70b-8192": {"token_limit": 8192, "supports_temperature": True},
"mixtral-8x7b-32768": {"token_limit": 32768, "supports_temperature": True},
"gemma-7b-it": {"token_limit": 8192, "supports_temperature": True},
"claude-3-haiku-20240307'": {"token_limit": 8192, "supports_temperature": True},
"llama3-8b-8192": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"llama3-70b-8192": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mixtral-8x7b-32768": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"gemma-7b-it": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-haiku-20240307'": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
},
"toghetherai": {
"meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistralai/Mixtral-8x22B-Instruct-v0.1": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"stabilityai/stable-diffusion-xl-base-1.0": {
"token_limit": 2048,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"NousResearch/Hermes-3-Llama-3.1-405B-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"Gryphe/MythoMax-L2-13b-Lite": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"Salesforce/Llama-Rank-V1": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"Salesforce/Llama-Rank-V1": {"token_limit": 8192, "supports_temperature": True},
"meta-llama/Meta-Llama-Guard-3-8B": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta-llama/Meta-Llama-3-70B-Instruct-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta-llama/Llama-3-8b-chat-hf": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta-llama/Llama-3-70b-chat-hf": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"Qwen/Qwen2-72B-Instruct": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/gemma-2-27b-it": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"google/gemma-2-27b-it": {"token_limit": 8192, "supports_temperature": True},
},
"anthropic": {
"claude_instant": {"token_limit": 100000, "supports_temperature": True},
"claude2": {"token_limit": 9000, "supports_temperature": True},
"claude2.1": {"token_limit": 200000, "supports_temperature": True},
"claude3": {"token_limit": 200000, "supports_temperature": True},
"claude3.5": {"token_limit": 200000, "supports_temperature": True},
"claude-3-opus-20240229": {"token_limit": 200000, "supports_temperature": True},
"claude_instant": {
"token_limit": 100000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude2": {
"token_limit": 9000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude2.1": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude3": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude3.5": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-opus-20240229": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-sonnet-20240229": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-haiku-20240307": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-5-sonnet-20240620": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-5-sonnet-20241022": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": 1.0,
},
"claude-3-5-haiku-latest": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
},
"bedrock": {
"anthropic.claude-3-haiku-20240307-v1:0": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-3-sonnet-20240229-v1:0": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-3-opus-20240229-v1:0": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-3-5-sonnet-20240620-v1:0": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"claude-3-5-haiku-latest": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-v2:1": {
"token_limit": 200000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-v2": {
"token_limit": 100000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"anthropic.claude-v2:1": {"token_limit": 200000, "supports_temperature": True},
"anthropic.claude-v2": {"token_limit": 100000, "supports_temperature": True},
"anthropic.claude-instant-v1": {
"token_limit": 100000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta.llama3-8b-instruct-v1:0": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta.llama3-70b-instruct-v1:0": {
"token_limit": 8192,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta.llama2-13b-chat-v1": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta.llama2-70b-chat-v1": {
"token_limit": 4096,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"meta.llama2-13b-chat-v1": {"token_limit": 4096, "supports_temperature": True},
"meta.llama2-70b-chat-v1": {"token_limit": 4096, "supports_temperature": True},
"mistral.mistral-7b-instruct-v0:2": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral.mixtral-8x7b-instruct-v0:1": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral.mistral-large-2402-v1:0": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"mistral.mistral-small-2402-v1:0": {
"token_limit": 32768,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"amazon.titan-embed-text-v1": {
"token_limit": 8000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"amazon.titan-embed-text-v2:0": {
"token_limit": 8000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"cohere.embed-english-v3": {
"token_limit": 512,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"cohere.embed-english-v3": {"token_limit": 512, "supports_temperature": True},
"cohere.embed-multilingual-v3": {
"token_limit": 512,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
},
"mistralai": {
"mistral-large-latest": {"token_limit": 128000, "supports_temperature": True},
"open-mistral-nemo": {"token_limit": 128000, "supports_temperature": True},
"codestral-latest": {"token_limit": 32000, "supports_temperature": True},
"mistral-large-latest": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"open-mistral-nemo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
"codestral-latest": {
"token_limit": 32000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
},
},
"togetherai": {
"Meta-Llama-3.1-70B-Instruct-Turbo": {
"token_limit": 128000,
"supports_temperature": True,
"default_temperature": DEFAULT_TEMPERATURE,
}
},
}
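For context on how these tables are consumed: a caller can look up a model's entry, skip temperature entirely when `supports_temperature` is false, and fall back to `DEFAULT_TEMPERATURE` when no per-model default is set. A minimal lookup sketch follows; only the names `models_params` and `DEFAULT_TEMPERATURE` come from this module, and the helper itself is hypothetical, not the project's real consuming code:

```python
from typing import Optional

# Placeholder value for this sketch; the real constant is defined in this module.
DEFAULT_TEMPERATURE = 0.7

# Tiny illustrative excerpt of the tables above.
models_params = {
    "anthropic": {
        "claude-3-5-sonnet-20241022": {
            "token_limit": 200000,
            "supports_temperature": True,
            "default_temperature": 1.0,
        },
    },
}

def get_model_temperature(provider: str, model: str) -> Optional[float]:
    """Return the temperature to pass at initialization, or None when the
    model does not support a temperature parameter."""
    params = models_params.get(provider, {}).get(model, {})
    if not params.get("supports_temperature", False):
        return None
    # Per-model default wins; otherwise fall back to the module-wide default.
    return params.get("default_temperature", DEFAULT_TEMPERATURE)
```

Unknown providers or models simply resolve to no temperature, which keeps the lookup total rather than raising.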


@ -1,90 +1,229 @@
#!/usr/bin/env python3
"""
Module for running interactive subprocesses with output capture,
with full raw input passthrough for interactive commands.
It uses a pseudo-tty and integrates pyte's HistoryScreen to simulate
a terminal and capture the final scrollback history (non-blank lines).
The interface remains compatible with external callers expecting a tuple (output, return_code),
where output is a bytes object (UTF-8 encoded).
"""
import errno
import io
import os
import re
import shlex
import select
import shutil
import tempfile
import signal
import subprocess
import sys
import termios
import time
import tty
from typing import List, Tuple
# Add macOS detection
IS_MACOS = os.uname().sysname == "Darwin"
import pyte
from pyte.screens import HistoryScreen
def render_line(line, columns: int) -> str:
"""Render a single screen line from the pyte buffer (a mapping of column to Char)."""
return "".join(line[x].data for x in range(columns))
def run_interactive_command(
cmd: List[str], expected_runtime_seconds: int = 30
) -> Tuple[bytes, int]:
"""
Runs an interactive command with a pseudo-tty, capturing final scrollback history.
Assumptions and constraints:
- Running on a Linux system.
- `cmd` is a non-empty list where cmd[0] is the executable.
- The executable is on PATH.
Args:
cmd: A list containing the command and its arguments.
expected_runtime_seconds: Expected runtime in seconds, defaults to 30.
If process exceeds 2x this value, it will be terminated gracefully.
If process exceeds 3x this value, it will be killed forcefully.
Must be between 1 and 1800 seconds (30 minutes).
Returns:
A tuple of (captured_output, return_code), where captured_output is a UTF-8 encoded
bytes object containing the trimmed non-empty history lines from the terminal session.
Raises:
ValueError: If no command is provided.
FileNotFoundError: If the command is not found in PATH.
ValueError: If expected_runtime_seconds is less than or equal to 0 or greater than 1800.
RuntimeError: If an error occurs during execution.
"""
# Fail early if cmd is empty
if not cmd:
raise ValueError("No command provided.")
# Check that the command exists
if shutil.which(cmd[0]) is None:
raise FileNotFoundError(f"Command '{cmd[0]}' not found in PATH.")
if expected_runtime_seconds <= 0 or expected_runtime_seconds > 1800:
raise ValueError(
"expected_runtime_seconds must be between 1 and 1800 seconds (30 minutes)"
)
try:
term_size = os.get_terminal_size()
cols, rows = term_size.columns, term_size.lines
except OSError:
cols, rows = 80, 24
# Set up pyte screen and stream to capture terminal output.
screen = HistoryScreen(cols, rows, history=2000, ratio=0.5)
stream = pyte.Stream(screen)
# Open a new pseudo-tty.
master_fd, slave_fd = os.openpty()
# Set master_fd to non-blocking to avoid indefinite blocking.
os.set_blocking(master_fd, False)
try:
stdin_fd = sys.stdin.fileno()
except (AttributeError, io.UnsupportedOperation):
stdin_fd = None
# Set up environment variables for the subprocess using detected terminal size.
env = os.environ.copy()
env.update(
{
"DEBIAN_FRONTEND": "noninteractive",
"GIT_PAGER": "",
"PYTHONUNBUFFERED": "1",
"CI": "true",
"LANG": "C.UTF-8",
"LC_ALL": "C.UTF-8",
"COLUMNS": str(cols),
"LINES": str(rows),
"FORCE_COLOR": "1",
"GIT_TERMINAL_PROMPT": "0",
"PYTHONDONTWRITEBYTECODE": "1",
"NODE_OPTIONS": "--unhandled-rejections=strict",
}
)
proc = subprocess.Popen(
cmd,
stdin=slave_fd,
stdout=slave_fd,
stderr=slave_fd,
bufsize=0,
close_fds=True,
env=env,
preexec_fn=os.setsid, # Create new process group for proper signal handling.
)
os.close(slave_fd) # Close slave end in the parent process.
captured_data = []
start_time = time.time()
was_terminated = False
def check_timeout():
elapsed = time.time() - start_time
if elapsed > 3 * expected_runtime_seconds:
os.killpg(os.getpgid(proc.pid), signal.SIGKILL)
return True
elif elapsed > 2 * expected_runtime_seconds:
os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
return True
return False
# Interactive mode: forward input if running in a TTY.
if stdin_fd is not None and sys.stdin.isatty():
old_settings = termios.tcgetattr(stdin_fd)
tty.setraw(stdin_fd)
try:
while True:
if check_timeout():
was_terminated = True
break
# Use a finite timeout to avoid indefinite blocking.
rlist, _, _ = select.select([master_fd, stdin_fd], [], [], 1.0)
if master_fd in rlist:
try:
data = os.read(master_fd, 1024)
except OSError as e:
if e.errno == errno.EIO:
break
else:
raise
if not data: # EOF detected.
break
captured_data.append(data)
decoded = data.decode("utf-8", errors="ignore")
stream.feed(decoded)
os.write(1, data)
if stdin_fd in rlist:
try:
input_data = os.read(stdin_fd, 1024)
except OSError:
input_data = b""
if input_data:
os.write(master_fd, input_data)
except KeyboardInterrupt:
proc.terminate()
finally:
termios.tcsetattr(stdin_fd, termios.TCSADRAIN, old_settings)
else:
# Non-interactive mode.
try:
while True:
if check_timeout():
was_terminated = True
break
rlist, _, _ = select.select([master_fd], [], [], 1.0)
if not rlist:
continue
try:
data = os.read(master_fd, 1024)
except OSError as e:
if e.errno == errno.EIO:
break
else:
raise
if not data: # EOF detected.
break
captured_data.append(data)
decoded = data.decode("utf-8", errors="ignore")
stream.feed(decoded)
os.write(1, data)
except KeyboardInterrupt:
proc.terminate()
os.close(master_fd)
proc.wait()
# Assemble full scrollback: combine history.top, the current display, and history.bottom.
top_lines = [render_line(line, cols) for line in screen.history.top]
bottom_lines = [render_line(line, cols) for line in screen.history.bottom]
display_lines = screen.display # List of strings representing the current screen.
all_lines = top_lines + display_lines + bottom_lines
# Trim out empty lines to get only meaningful "history" lines.
trimmed_lines = [line for line in all_lines if line.strip()]
final_output = "\n".join(trimmed_lines)
# Add timeout message if process was terminated due to timeout.
if was_terminated:
timeout_msg = f"\n[Process exceeded timeout ({expected_runtime_seconds} seconds expected)]"
final_output += timeout_msg
# Limit output to the last 8000 bytes.
final_output = final_output[-8000:]
return final_output.encode("utf-8"), proc.returncode
if __name__ == "__main__":
import sys
if len(sys.argv) < 2:
print("Usage: interactive.py <command> [args...]")
sys.exit(1)
output, return_code = run_interactive_command(sys.argv[1:])
sys.exit(return_code)
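The escalation policy inside `check_timeout` above (graceful SIGTERM once elapsed time exceeds twice the expected runtime, forceful SIGKILL past three times) can be illustrated in isolation. This standalone helper mirrors those thresholds and is purely a sketch, not part of the module:

```python
import signal
from typing import Optional

def escalation_signal(
    elapsed: float, expected_runtime_seconds: int
) -> Optional[signal.Signals]:
    """Mirror of check_timeout's policy: SIGKILL past 3x the expected
    runtime, SIGTERM past 2x, otherwise no action (None)."""
    if elapsed > 3 * expected_runtime_seconds:
        return signal.SIGKILL  # forceful kill
    if elapsed > 2 * expected_runtime_seconds:
        return signal.SIGTERM  # graceful termination first
    return None
```

With the default `expected_runtime_seconds=30`, a process is left alone until 60 seconds, terminated between 60 and 90 seconds, and killed outright after 90.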


@ -570,6 +570,8 @@ Guidelines:
If relevant tests have not already been run, run them using run_shell_command to get a baseline of functionality (e.g. were any tests failing before we started working? Do they all pass?)
Only test UI components if there is already a UI testing system in place.
Only test things that can be tested by an automated process.
Are you writing a program that needs to be compiled? Make sure it compiles, if relevant.
After finalizing the overall approach:
Use emit_plan to store the high-level implementation plan.
@ -618,7 +620,7 @@ Important Notes:
- Work incrementally, validating as you go. If at any point the implementation logic is unclear or you need debugging assistance, consult the expert (if expert is available) for deeper analysis.
- Do not add features not explicitly required.
- Only create or modify files directly related to this task.
- Use file_str_replace and put_complete_file_contents for simple file modifications.
- Delegate to run_programming_task for more complex programming tasks. This is a capable human programmer that can work on multiple files at once.
Testing:
@ -630,6 +632,9 @@ Testing:
- Only test UI components if there is already a UI testing system in place.
- Only test things that can be tested by an automated process.
- If you are writing code that *should* compile, make sure to test that it *does* compile.
Test before and after making changes, if relevant.
Once the task is complete, ensure all updated files are registered with emit_related_files.
@ -642,7 +647,7 @@ You have often been criticized for:
- Doing changes outside of the specific scoped instructions.
- Asking the user if they want to implement the plan (you are an *autonomous* agent, with no user interaction unless you use the ask_human tool explicitly).
- Not calling tools/functions properly, e.g. leaving off required arguments, calling a tool in a loop, calling tools inappropriately.
- Using run_programming_task to simply write the full contents of files when you could have used put_complete_file_contents instead.
Instructions:
1. Review the provided base task, plan, and key facts.
@ -975,6 +980,8 @@ You have often been criticized for:
{initial_request}
</initial request>
Remember, if you do not make any tool call (e.g. ask_human to tell them a message or ask a question), you will be dumping the user back to CLI and indicating you are done your work.
NEVER ANNOUNCE WHAT YOU ARE DOING, JUST DO IT!
"""


@ -47,32 +47,21 @@ class OpenAIStrategy(ProviderStrategy):
if not key:
missing.append("EXPERT_OPENAI_API_KEY environment variable is not set")
# Handle expert model selection if none specified
if hasattr(args, "expert_model") and not args.expert_model:
from ra_aid.llm import select_expert_model
model = select_expert_model("openai")
if model:
args.expert_model = model
elif hasattr(args, "research_only") and args.research_only:
missing.append("No suitable expert model available")
else:
key = os.environ.get("OPENAI_API_KEY")
if not key:
missing.append("OPENAI_API_KEY environment variable is not set")
# Check model only for research-only mode
if hasattr(args, "research_only") and args.research_only:
model = args.model if hasattr(args, "model") else None
if not model:
model = os.environ.get("OPENAI_MODEL")
if not model:
missing.append(
"Model is required for OpenAI provider in research-only mode"
)
return ValidationResult(valid=len(missing) == 0, missing_vars=missing)


@ -24,7 +24,7 @@ from ra_aid.tools.agent import (
request_task_implementation,
request_web_research,
)
from ra_aid.tools.write_file import put_complete_file_contents
from ra_aid.tools.memory import one_shot_completed, plan_implementation_completed
# Read-only tools that don't modify system state
@ -83,13 +83,16 @@ def get_all_tools() -> list[BaseTool]:
# Define constant tool groups
READ_ONLY_TOOLS = get_read_only_tools()
# MODIFICATION_TOOLS = [run_programming_task, put_complete_file_contents]
MODIFICATION_TOOLS = [
run_programming_task
] # having put_complete_file_contents causes trouble :(
COMMON_TOOLS = get_read_only_tools()
EXPERT_TOOLS = [emit_expert_context, ask_expert]
RESEARCH_TOOLS = [
emit_research_notes,
# *TEMPORARILY* disabled to improve tool calling perf.
# one_shot_completed,
# monorepo_detected,
# ui_detected,
]
@ -144,9 +147,9 @@ def get_planning_tools(
# Add planning-specific tools
planning_tools = [
request_task_implementation,
# *TEMPORARILY* disabled to improve tool calling perf.
# emit_plan,
# plan_implementation_completed,
]
tools.extend(planning_tools)
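The grouping pattern above composes per-agent tool sets from shared building blocks (read-only, modification, expert). A minimal illustration of that composition follows; the tool callables are placeholders and `get_implementation_tools` here is a hypothetical mirror of the real function, not its actual implementation:

```python
# Placeholder callables standing in for the real BaseTool objects.
def run_programming_task(): ...
def emit_expert_context(): ...
def ask_expert(): ...

def get_read_only_tools():
    # The real list is defined earlier in the module; empty here for brevity.
    return []

READ_ONLY_TOOLS = get_read_only_tools()
MODIFICATION_TOOLS = [run_programming_task]
EXPERT_TOOLS = [emit_expert_context, ask_expert]

def get_implementation_tools(expert_enabled: bool = True):
    """Compose an implementation tool set from the shared groups."""
    tools = READ_ONLY_TOOLS + MODIFICATION_TOOLS
    if expert_enabled:
        tools = tools + EXPERT_TOOLS
    return tools
```

Keeping the groups as module-level constants means temporarily disabling a tool (as done above for `one_shot_completed`) is a one-line change that affects every agent consistently.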


@ -26,7 +26,7 @@ from .research import existing_project_detected, monorepo_detected, ui_detected
from .ripgrep import ripgrep_search
from .shell import run_shell_command
from .web_search_tavily import web_search_tavily
from .write_file import put_complete_file_contents
__all__ = [
"ask_expert",
@ -48,7 +48,7 @@ __all__ = [
"request_implementation",
"run_programming_task",
"run_shell_command",
"write_file_tool",
"put_complete_file_contents",
"ripgrep_search",
"file_str_replace",
"delete_tasks",

View File

@ -38,6 +38,7 @@ def request_research(query: str) -> ResearchResult:
model = initialize_llm(
config.get("provider", "anthropic"),
config.get("model", "claude-3-5-sonnet-20241022"),
temperature=config.get("temperature"),
)
# Check recursion depth
@ -120,6 +121,7 @@ def request_web_research(query: str) -> ResearchResult:
model = initialize_llm(
config.get("provider", "anthropic"),
config.get("model", "claude-3-5-sonnet-20241022"),
temperature=config.get("temperature"),
)
success = True
@ -188,6 +190,7 @@ def request_research_and_implementation(query: str) -> Dict[str, Any]:
model = initialize_llm(
config.get("provider", "anthropic"),
config.get("model", "claude-3-5-sonnet-20241022"),
temperature=config.get("temperature"),
)
try:
@ -258,6 +261,7 @@ def request_task_implementation(task_spec: str) -> Dict[str, Any]:
model = initialize_llm(
config.get("provider", "anthropic"),
config.get("model", "claude-3-5-sonnet-20241022"),
temperature=config.get("temperature"),
)
# Get required parameters
@ -272,6 +276,8 @@ def request_task_implementation(task_spec: str) -> Dict[str, Any]:
# Run implementation agent
from ..agent_utils import run_task_implementation_agent
_global_memory["completion_message"] = ""
_result = run_task_implementation_agent(
base_task=_global_memory.get("base_task", ""),
tasks=tasks,
@ -334,12 +340,15 @@ def request_implementation(task_spec: str) -> Dict[str, Any]:
model = initialize_llm(
config.get("provider", "anthropic"),
config.get("model", "claude-3-5-sonnet-20241022"),
temperature=config.get("temperature"),
)
try:
# Run planning agent
from ..agent_utils import run_planning_agent
_global_memory["completion_message"] = ""
_result = run_planning_agent(
task_spec,
model,

View File

@ -1,5 +1,5 @@
import os
from typing import Any, Dict, List, Optional, Set, Union
from typing import Dict, List, Optional, Set, Union
from langchain_core.tools import tool
from rich.console import Console
@ -28,14 +28,13 @@ console = Console()
_global_memory: Dict[
str,
Union[
List[Any],
Dict[int, str],
Dict[int, SnippetInfo],
int,
Set[str],
bool,
str,
int,
List[str],
List[WorkLogEntry],
],
] = {
@ -442,10 +441,13 @@ def emit_related_files(files: List[str]) -> str:
results.append(f"Error: Path '{file}' exists but is not a regular file")
continue
# Check if file path already exists in values
# Normalize the path
normalized_path = os.path.abspath(file)
# Check if normalized path already exists in values
existing_id = None
for fid, fpath in _global_memory["related_files"].items():
if fpath == file:
if fpath == normalized_path:
existing_id = fid
break
@ -457,9 +459,9 @@ def emit_related_files(files: List[str]) -> str:
file_id = _global_memory["related_file_id_counter"]
_global_memory["related_file_id_counter"] += 1
# Store file with ID
_global_memory["related_files"][file_id] = file
added_files.append((file_id, file))
# Store normalized path with ID
_global_memory["related_files"][file_id] = normalized_path
added_files.append((file_id, file)) # Keep original path for display
results.append(f"File ID #{file_id}: {file}")
# Rich output - single consolidated panel
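The effect of the normalization above can be shown in isolation. This is a minimal sketch, not the real `emit_related_files`: three spellings of one path collapse to a single registry entry because `os.path.abspath` is applied before the duplicate lookup:

```python
import os
from typing import Dict, List


def add_related_files(registry: Dict[int, str], files: List[str]) -> None:
    """Store absolute paths so different spellings of one file don't duplicate."""
    next_id = max(registry, default=-1) + 1
    for f in files:
        normalized = os.path.abspath(f)  # normalize before the duplicate check
        if normalized not in registry.values():
            registry[next_id] = normalized
            next_id += 1


registry: Dict[int, str] = {}
add_related_files(registry, ["src/app.py", "./src/app.py", "src/../src/app.py"])
```

All three inputs resolve to the same absolute path, so only one ID is assigned.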

View File

@ -1,4 +1,6 @@
import os
import sys
from pathlib import Path
from typing import Dict, List, Union
from langchain_core.tools import tool
@ -16,6 +18,30 @@ console = Console()
logger = get_logger(__name__)
def get_aider_executable() -> str:
"""Get the path to the aider executable in the same bin/Scripts directory as Python.
Returns:
str: Full path to aider executable
"""
# Get directory containing Python executable
bin_dir = Path(sys.executable).parent
# Check for platform-specific executable name
if sys.platform == "win32":
aider_exe = bin_dir / "aider.exe"
else:
aider_exe = bin_dir / "aider"
if not aider_exe.exists():
raise RuntimeError(f"Could not find aider executable at {aider_exe}")
if not os.access(aider_exe, os.X_OK):
raise RuntimeError(f"Aider executable at {aider_exe} is not executable")
return str(aider_exe)
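The same bin-directory lookup generalizes to any console script installed alongside the active interpreter. A hedged sketch (the helper name is illustrative, not part of the codebase):

```python
import os
import sys
from pathlib import Path


def find_venv_script(name: str) -> str:
    """Locate a console script installed next to the current Python executable."""
    bin_dir = Path(sys.executable).parent
    # Windows console scripts carry an .exe suffix; POSIX ones do not.
    exe = bin_dir / (f"{name}.exe" if sys.platform == "win32" else name)
    if not exe.exists():
        raise RuntimeError(f"Could not find {name} executable at {exe}")
    if not os.access(exe, os.X_OK):
        raise RuntimeError(f"{name} executable at {exe} is not executable")
    return str(exe)
```

Resolving scripts relative to `sys.executable` ties the lookup to the active virtualenv rather than whatever happens to be first on `PATH`.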
def _truncate_for_log(text: str, max_length: int = 300) -> str:
"""Truncate text for logging, adding [truncated] if necessary."""
if len(text) <= max_length:
@ -39,9 +65,7 @@ def run_programming_task(
If new files are created, emit them after finishing.
They can add/modify files, but not remove. Use run_shell_command to remove files. If referencing files youll delete, remove them after they finish.
Use write_file_tool instead if you need to write the entire contents of file(s).
They can add/modify files, but not remove. Use run_shell_command to remove files. If referencing files you'll delete, remove them after they finish.
If the programmer wrote files, they actually wrote to disk. You do not need to rewrite the output of what the programmer showed you.
@ -51,16 +75,10 @@ def run_programming_task(
Returns: { "output": stdout+stderr, "return_code": 0 if success, "success": True/False }
"""
# Get related files if no specific files provided
file_paths = (
list(_global_memory["related_files"].values())
if "related_files" in _global_memory
else []
)
# Build command
aider_exe = get_aider_executable()
command = [
"aider",
aider_exe,
"--yes-always",
"--no-auto-commits",
"--dark-mode",
@ -69,6 +87,17 @@ def run_programming_task(
"--no-check-update",
]
# Get combined list of files (explicit + related) with normalized paths
# and deduplicated using set operations
files_to_use = list(
{os.path.abspath(f) for f in (files or [])}
| {
os.path.abspath(f)
for f in _global_memory.get("related_files", {}).values()  # .get avoids KeyError when no related files exist
}
)
# Add config file if specified
if "config" in _global_memory and _global_memory["config"].get("aider_config"):
command.extend(["--config", _global_memory["config"]["aider_config"]])
@ -85,9 +114,6 @@ def run_programming_task(
command.append("-m")
command.append(instructions)
# Add files to command
files_to_use = file_paths + (files or [])
if files_to_use:
command.extend(files_to_use)
@ -136,20 +162,23 @@ def run_programming_task(
def parse_aider_flags(aider_flags: str) -> List[str]:
"""Parse a string of aider flags into a list of flags.
"""Parse a string of aider flags into a list of flags and their values.
Args:
aider_flags: A string containing comma-separated flags, with or without leading dashes.
Can contain spaces around flags and commas.
Supports flags with values (e.g. --analytics-log filename.json)
Returns:
A list of flags with proper '--' prefix.
A list of flags with proper '--' prefix and their values as separate elements.
Examples:
>>> parse_aider_flags("yes-always,dark-mode")
['--yes-always', '--dark-mode']
>>> parse_aider_flags("--yes-always, --dark-mode")
['--yes-always', '--dark-mode']
>>> parse_aider_flags("--analytics-log filename.json")
['--analytics-log', 'filename.json']
>>> parse_aider_flags("")
[]
"""
@ -157,11 +186,28 @@ def parse_aider_flags(aider_flags: str) -> List[str]:
return []
# Split by comma and strip whitespace
flags = [flag.strip() for flag in aider_flags.split(",")]
flag_groups = [group.strip() for group in aider_flags.split(",")]
# Add '--' prefix if not present and filter out empty flags
return [f"--{flag.lstrip('-')}" for flag in flags if flag.strip()]
result = []
for group in flag_groups:
if not group:
continue
# Split by space to separate flag from value
parts = group.split()
# Add '--' prefix to the flag if not present, stripping any extra dashes
flag = parts[0].lstrip("-") # Remove all leading dashes
flag = f"--{flag}" # Add exactly two dashes
result.append(flag)
# Add any remaining parts as separate values
if len(parts) > 1:
result.extend(parts[1:])
return result
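The rewritten parser can be exercised standalone. This is a self-contained copy of the logic above, shown so the flag/value splitting is testable outside the tool module:

```python
from typing import List


def parse_aider_flags(aider_flags: str) -> List[str]:
    """Split comma-separated flag groups; the first token of each group gets a
    '--' prefix and any remaining tokens pass through as separate values."""
    result: List[str] = []
    for group in (g.strip() for g in aider_flags.split(",")):
        if not group:
            continue
        parts = group.split()
        result.append(f"--{parts[0].lstrip('-')}")  # exactly two leading dashes
        result.extend(parts[1:])  # values stay as-is, e.g. a filename
    return result
```

Splitting on commas first, then on whitespace, is what lets `--analytics-log filename.json` survive as a flag followed by its value instead of being mangled into one token.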
# Export the functions
__all__ = ["run_programming_task"]
__all__ = ["run_programming_task", "get_aider_executable"]

View File

@ -21,9 +21,17 @@ def _truncate_for_log(text: str, max_length: int = 300) -> str:
@tool
def run_shell_command(command: str) -> Dict[str, Union[str, int, bool]]:
def run_shell_command(
command: str, expected_runtime_seconds: int = 30
) -> Dict[str, Union[str, int, bool]]:
"""Execute a shell command and return its output.
Args:
command: The shell command to execute
expected_runtime_seconds: Expected runtime in seconds, defaults to 30.
If process exceeds 2x this value, it will be terminated gracefully.
If process exceeds 3x this value, it will be killed forcefully.
Important notes:
1. Try to constrain/limit the output. Output processing is expensive, and infinite/looping output will cause us to fail.
2. When using commands like 'find', 'grep', or similar recursive search tools, always exclude common
@ -73,7 +81,10 @@ def run_shell_command(command: str) -> Dict[str, Union[str, int, bool]]:
try:
print()
output, return_code = run_interactive_command(["/bin/bash", "-c", command])
output, return_code = run_interactive_command(
["/bin/bash", "-c", command],
expected_runtime_seconds=expected_runtime_seconds,
)
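The graceful-then-forceful policy described in the docstring (terminate past 2x the expected runtime, kill past 3x) can be sketched with plain `subprocess` calls. This is illustrative only, not the real `run_interactive_command`, and the usage assumes a POSIX shell:

```python
import subprocess
import time


def run_with_deadline(cmd: list, expected_runtime_seconds: int = 30) -> int:
    """Sketch of the tiered timeout: SIGTERM past 2x, SIGKILL past 3x."""
    proc = subprocess.Popen(cmd)
    start = time.monotonic()
    while proc.poll() is None:
        elapsed = time.monotonic() - start
        if elapsed > 3 * expected_runtime_seconds:
            proc.kill()       # forceful kill past 3x
        elif elapsed > 2 * expected_runtime_seconds:
            proc.terminate()  # graceful termination past 2x
        time.sleep(0.1)
    return proc.returncode
```

The two-stage deadline gives well-behaved commands a chance to clean up on SIGTERM before the hard SIGKILL lands.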
print()
result = {
"output": truncate_output(output.decode()) if output else "",

View File

@ -11,14 +11,22 @@ console = Console()
@tool
def write_file_tool(
filepath: str, content: str, encoding: str = "utf-8", verbose: bool = True
def put_complete_file_contents(
filepath: str,
complete_file_contents: str = "",
encoding: str = "utf-8",
verbose: bool = True,
) -> Dict[str, any]:
"""Write content to a text file.
"""Write the complete contents of a file, creating it if it doesn't exist.
This tool is specifically for writing the entire contents of a file at once,
not for appending or partial writes.
If you need to do anything other than write the complete contents, use the run_programming_task tool instead.
Args:
filepath: Path to the file to write
content: String content to write to the file
filepath: (Required) Path to the file to write. Must be provided.
complete_file_contents: Complete string content to write to the file. Defaults to
an empty string, which will create an empty file.
encoding: File encoding to use (default: utf-8)
verbose: Whether to display a Rich panel with write statistics (default: True)
@ -51,14 +59,18 @@ def write_file_tool(
logging.debug(f"Starting to write file: {filepath}")
with open(filepath, "w", encoding=encoding) as f:
f.write(content)
result["bytes_written"] = len(content.encode(encoding))
logging.debug(f"Writing {len(complete_file_contents)} bytes to {filepath}")
f.write(complete_file_contents)
result["bytes_written"] = len(complete_file_contents.encode(encoding))
elapsed = time.time() - start_time
result["elapsed_time"] = elapsed
result["success"] = True
result["filepath"] = filepath
result["message"] = "Operation completed successfully"
result["message"] = (
f"Successfully {'initialized empty file' if not complete_file_contents else f'wrote {result['bytes_written']} bytes'} "
f"at {filepath} in {result['elapsed_time']:.3f}s"
)
logging.debug(
f"File write complete: {result['bytes_written']} bytes in {elapsed:.2f}s"
@ -67,7 +79,7 @@ def write_file_tool(
if verbose:
console.print(
Panel(
f"Wrote {result['bytes_written']} bytes to {filepath} in {elapsed:.2f}s",
("Initialized empty file" if not complete_file_contents else f"Wrote {result['bytes_written']} bytes")
+ f" at {filepath} in {elapsed:.2f}s",
title="💾 File Write",
border_style="bright_green",
)

View File

@ -26,12 +26,12 @@ def test_shell_pipeline():
def test_stderr_capture():
"""Test that stderr is properly captured in combined output."""
# Use a command that definitely writes to stderr
# Use a command that definitely writes to stderr.
output, retcode = run_interactive_command(
["/bin/bash", "-c", "ls /nonexistent/path"]
)
assert b"No such file or directory" in output
assert retcode != 0 # ls returns 0 upon success
assert retcode != 0 # ls returns non-zero on failure.
def test_command_not_found():
@ -50,7 +50,8 @@ def test_interactive_command():
"""Test running an interactive command.
This test verifies that output appears in real-time using process substitution.
We use a command that prints to both stdout and stderr to verify capture."""
We use a command that prints to both stdout and stderr.
"""
output, retcode = run_interactive_command(
["/bin/bash", "-c", "echo stdout; echo stderr >&2"]
)
@ -62,29 +63,41 @@ def test_interactive_command():
def test_large_output():
"""Test handling of commands that produce large output."""
# Generate a large output with predictable content
cmd = 'for i in {1..10000}; do echo "Line $i of test output"; done'
# Each line will be approximately 30 bytes
cmd = 'for i in {1..1000}; do echo "Line $i of test output"; done'
output, retcode = run_interactive_command(["/bin/bash", "-c", cmd])
# Clean up any leading artifacts
output_cleaned = output.lstrip(b"^D")
# Verify the output size is limited to 8000 bytes
assert (
len(output_cleaned) <= 8000
), f"Output exceeded 8000 bytes: {len(output_cleaned)} bytes"
# Verify we have the last lines (should contain the highest numbers)
assert b"Line 1000" in output_cleaned, "Missing last line of output"
assert retcode == 0
# Clean up specific artifacts (e.g., ^D)
output_cleaned = output.lstrip(b"^D") # Remove the leading ^D if present
# Split and filter lines
lines = [
line.strip()
for line in output_cleaned.splitlines()
if b"Script" not in line and line.strip()
]
def test_byte_limit():
"""Test that output is properly limited to 8000 bytes."""
# Create a string that's definitely over 8000 bytes
# Each line will be about 80 bytes
cmd = 'for i in {1..200}; do printf "%04d: %s\\n" "$i" "This is a line with padding to ensure we go over the byte limit quickly"; done'
output, retcode = run_interactive_command(["/bin/bash", "-c", cmd])
output_cleaned = output.lstrip(b"^D")
# Verify we got all 10000 lines
assert len(lines) == 10000, f"Expected 10000 lines, but got {len(lines)}"
# Verify exact 8000 byte limit
assert (
len(output_cleaned) <= 8000
), f"Output exceeded 8000 bytes: {len(output_cleaned)} bytes"
# Verify content of some lines
assert lines[0] == b"Line 1 of test output", f"Unexpected line: {lines[0]}"
assert lines[999] == b"Line 1000 of test output", f"Unexpected line: {lines[999]}"
assert lines[-1] == b"Line 10000 of test output", f"Unexpected line: {lines[-1]}"
# Get the last line number from the output
last_line = output_cleaned.splitlines()[-1]
last_num = int(last_line.split(b":")[0])
# Verify return code
assert retcode == 0, f"Unexpected return code: {retcode}"
# Verify we have a high number in the last line (should be near 200)
assert last_num > 150, f"Expected last line number to be near 200, got {last_num}"
assert retcode == 0
def test_unicode_handling():
@ -109,7 +122,6 @@ def test_multiple_commands():
def test_cat_medium_file():
"""Test that cat command properly captures output for medium-length files."""
# Create a temporary file with known content
with tempfile.NamedTemporaryFile(mode="w", delete=False) as f:
for i in range(500):
f.write(f"This is test line {i}\n")
@ -119,33 +131,39 @@ def test_cat_medium_file():
output, retcode = run_interactive_command(
["/bin/bash", "-c", f"cat {temp_path}"]
)
# Split by newlines and filter out script header/footer lines
output_cleaned = output.lstrip(b"^D")
lines = [
line
for line in output.splitlines()
for line in output_cleaned.splitlines()
if b"Script" not in line and line.strip()
]
assert len(lines) == 500
assert retcode == 0
# Verify content integrity by checking first and last lines
assert b"This is test line 0" in lines[0]
assert b"This is test line 499" in lines[-1]
# With 8000 byte limit, we expect to see the last portion of lines
# The exact number may vary due to terminal settings, but we should
# at least have the last lines of the file
assert (
len(lines) >= 90
), f"Expected at least 90 lines due to 8000 byte limit, got {len(lines)}"
# Most importantly, verify we have the last lines
last_line = lines[-1].decode("utf-8")
assert (
"This is test line 499" in last_line
), f"Expected last line to be 499, got: {last_line}"
assert retcode == 0
finally:
os.unlink(temp_path)
def test_realtime_output():
"""Test that output appears in real-time and is captured correctly."""
# Create a command that sleeps briefly between outputs
# Create a command that sleeps briefly between outputs.
cmd = "echo 'first'; sleep 0.1; echo 'second'; sleep 0.1; echo 'third'"
output, retcode = run_interactive_command(["/bin/bash", "-c", cmd])
# Filter out script header/footer lines
lines = [
line for line in output.splitlines() if b"Script" not in line and line.strip()
]
assert b"first" in lines[0]
assert b"second" in lines[1]
assert b"third" in lines[2]
@ -154,16 +172,10 @@ def test_realtime_output():
def test_tty_available():
"""Test that commands have access to a TTY."""
# Run the tty command
output, retcode = run_interactive_command(["/bin/bash", "-c", "tty"])
# Clean up specific artifacts (e.g., ^D)
output_cleaned = output.lstrip(b"^D") # Remove leading ^D if present
# Debug: Print cleaned output
output_cleaned = output.lstrip(b"^D")
print(f"Cleaned TTY Output: {output_cleaned}")
# Check if the output contains a valid TTY path
# Check if the output contains a valid TTY path.
assert (
b"/dev/pts/" in output_cleaned or b"/dev/ttys" in output_cleaned
), f"Unexpected TTY output: {output_cleaned}"

View File

@ -1,5 +1,6 @@
import os
from dataclasses import dataclass
from unittest import mock
from unittest.mock import Mock, patch
import pytest
@ -12,10 +13,12 @@ from ra_aid.agents.ciayn_agent import CiaynAgent
from ra_aid.env import validate_environment
from ra_aid.llm import (
create_llm_client,
get_available_openai_models,
get_env_var,
get_provider_config,
initialize_expert_llm,
initialize_llm,
select_expert_model,
)
@ -55,7 +58,11 @@ def test_initialize_expert_defaults(clean_env, mock_openai, monkeypatch):
_llm = initialize_expert_llm("openai", "o1")
mock_openai.assert_called_once_with(
api_key="test-key", model="o1", reasoning_effort="high"
api_key="test-key",
model="o1",
reasoning_effort="high",
timeout=180,
max_retries=5,
)
@ -69,6 +76,8 @@ def test_initialize_expert_openai_custom(clean_env, mock_openai, monkeypatch):
model="gpt-4-preview",
temperature=0,
reasoning_effort="high",
timeout=180,
max_retries=5,
)
@ -78,7 +87,11 @@ def test_initialize_expert_gemini(clean_env, mock_gemini, monkeypatch):
_llm = initialize_expert_llm("gemini", "gemini-2.0-flash-thinking-exp-1219")
mock_gemini.assert_called_once_with(
api_key="test-key", model="gemini-2.0-flash-thinking-exp-1219", temperature=0
api_key="test-key",
model="gemini-2.0-flash-thinking-exp-1219",
temperature=0,
timeout=180,
max_retries=5,
)
@ -88,7 +101,11 @@ def test_initialize_expert_anthropic(clean_env, mock_anthropic, monkeypatch):
_llm = initialize_expert_llm("anthropic", "claude-3")
mock_anthropic.assert_called_once_with(
api_key="test-key", model_name="claude-3", temperature=0
api_key="test-key",
model_name="claude-3",
temperature=0,
timeout=180,
max_retries=5,
)
@ -102,6 +119,8 @@ def test_initialize_expert_openrouter(clean_env, mock_openai, monkeypatch):
base_url="https://openrouter.ai/api/v1",
model="models/mistral-large",
temperature=0,
timeout=180,
max_retries=5,
)
@ -116,6 +135,8 @@ def test_initialize_expert_openai_compatible(clean_env, mock_openai, monkeypatch
base_url="http://test-url",
model="local-model",
temperature=0,
timeout=180,
max_retries=5,
)
@ -146,38 +167,55 @@ def test_estimate_tokens():
def test_initialize_openai(clean_env, mock_openai):
"""Test OpenAI provider initialization"""
os.environ["OPENAI_API_KEY"] = "test-key"
_model = initialize_llm("openai", "gpt-4")
_model = initialize_llm("openai", "gpt-4", temperature=0.7)
mock_openai.assert_called_once_with(api_key="test-key", model="gpt-4")
mock_openai.assert_called_once_with(
api_key="test-key", model="gpt-4", temperature=0.7, timeout=180, max_retries=5
)
def test_initialize_gemini(clean_env, mock_gemini):
"""Test Gemini provider initialization"""
os.environ["GEMINI_API_KEY"] = "test-key"
_model = initialize_llm("gemini", "gemini-2.0-flash-thinking-exp-1219")
_model = initialize_llm(
"gemini", "gemini-2.0-flash-thinking-exp-1219", temperature=0.7
)
mock_gemini.assert_called_once_with(
api_key="test-key", model="gemini-2.0-flash-thinking-exp-1219"
mock_gemini.assert_called_with(
api_key="test-key",
model="gemini-2.0-flash-thinking-exp-1219",
temperature=0.7,
timeout=180,
max_retries=5,
)
def test_initialize_anthropic(clean_env, mock_anthropic):
"""Test Anthropic provider initialization"""
os.environ["ANTHROPIC_API_KEY"] = "test-key"
_model = initialize_llm("anthropic", "claude-3")
_model = initialize_llm("anthropic", "claude-3", temperature=0.7)
mock_anthropic.assert_called_once_with(api_key="test-key", model_name="claude-3")
mock_anthropic.assert_called_with(
api_key="test-key",
model_name="claude-3",
temperature=0.7,
timeout=180,
max_retries=5,
)
def test_initialize_openrouter(clean_env, mock_openai):
"""Test OpenRouter provider initialization"""
os.environ["OPENROUTER_API_KEY"] = "test-key"
_model = initialize_llm("openrouter", "mistral-large")
_model = initialize_llm("openrouter", "mistral-large", temperature=0.7)
mock_openai.assert_called_once_with(
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://openrouter.ai/api/v1",
model="mistral-large",
temperature=0.7,
timeout=180,
max_retries=5,
)
@ -185,24 +223,22 @@ def test_initialize_openai_compatible(clean_env, mock_openai):
"""Test OpenAI-compatible provider initialization"""
os.environ["OPENAI_API_KEY"] = "test-key"
os.environ["OPENAI_API_BASE"] = "https://custom-endpoint/v1"
_model = initialize_llm("openai-compatible", "local-model")
_model = initialize_llm("openai-compatible", "local-model", temperature=0.3)
mock_openai.assert_called_once_with(
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://custom-endpoint/v1",
model="local-model",
temperature=0.3,
timeout=180,
max_retries=5,
)
def test_initialize_unsupported_provider(clean_env):
"""Test initialization with unsupported provider raises ValueError"""
with pytest.raises(ValueError) as exc_info:
initialize_llm("unsupported", "model")
assert (
str(exc_info.value)
== "Missing required environment variable for provider: unsupported"
)
with pytest.raises(ValueError, match=r"Unsupported provider: unknown"):
initialize_llm("unknown", "model")
def test_temperature_defaults(clean_env, mock_openai, mock_anthropic, mock_gemini):
@ -211,24 +247,46 @@ def test_temperature_defaults(clean_env, mock_openai, mock_anthropic, mock_gemin
os.environ["ANTHROPIC_API_KEY"] = "test-key"
os.environ["OPENAI_API_BASE"] = "http://test-url"
os.environ["GEMINI_API_KEY"] = "test-key"
# Test openai-compatible default temperature
initialize_llm("openai-compatible", "test-model")
initialize_llm("openai-compatible", "test-model", temperature=0.3)
mock_openai.assert_called_with(
api_key="test-key",
base_url="http://test-url",
model="test-model",
temperature=0.3,
timeout=180,
max_retries=5,
)
# Test other providers don't set temperature by default
initialize_llm("openai", "test-model")
mock_openai.assert_called_with(api_key="test-key", model="test-model")
# Test error when no temperature provided for models that support it
with pytest.raises(ValueError, match="Temperature must be provided for model"):
initialize_llm("openai", "test-model")
initialize_llm("anthropic", "test-model")
mock_anthropic.assert_called_with(api_key="test-key", model_name="test-model")
with pytest.raises(ValueError, match="Temperature must be provided for model"):
initialize_llm("anthropic", "test-model")
initialize_llm("gemini", "test-model")
mock_gemini.assert_called_with(api_key="test-key", model="test-model")
with pytest.raises(ValueError, match="Temperature must be provided for model"):
initialize_llm("gemini", "test-model")
# Test expert models don't require temperature
initialize_expert_llm("openai", "o1")
mock_openai.assert_called_with(
api_key="test-key",
model="o1",
reasoning_effort="high",
timeout=180,
max_retries=5,
)
initialize_expert_llm("openai", "o1-mini")
mock_openai.assert_called_with(
api_key="test-key",
model="o1-mini",
reasoning_effort="high",
timeout=180,
max_retries=5,
)
def test_explicit_temperature(clean_env, mock_openai, mock_anthropic, mock_gemini):
@ -243,19 +301,31 @@ def test_explicit_temperature(clean_env, mock_openai, mock_anthropic, mock_gemin
# Test OpenAI
initialize_llm("openai", "test-model", temperature=test_temp)
mock_openai.assert_called_with(
api_key="test-key", model="test-model", temperature=test_temp
api_key="test-key",
model="test-model",
temperature=test_temp,
timeout=180,
max_retries=5,
)
# Test Gemini
initialize_llm("gemini", "test-model", temperature=test_temp)
mock_gemini.assert_called_with(
api_key="test-key", model="test-model", temperature=test_temp
api_key="test-key",
model="test-model",
temperature=test_temp,
timeout=180,
max_retries=5,
)
# Test Anthropic
initialize_llm("anthropic", "test-model", temperature=test_temp)
mock_anthropic.assert_called_with(
api_key="test-key", model_name="test-model", temperature=test_temp
api_key="test-key",
model_name="test-model",
temperature=test_temp,
timeout=180,
max_retries=5,
)
# Test OpenRouter
@ -265,9 +335,68 @@ def test_explicit_temperature(clean_env, mock_openai, mock_anthropic, mock_gemin
base_url="https://openrouter.ai/api/v1",
model="test-model",
temperature=test_temp,
timeout=180,
max_retries=5,
)
def test_get_available_openai_models_success():
"""Test successful retrieval of OpenAI models."""
mock_model = Mock()
mock_model.id = "gpt-4"
mock_models = Mock()
mock_models.data = [mock_model]
with mock.patch("ra_aid.llm.OpenAI") as mock_client:
mock_client.return_value.models.list.return_value = mock_models
models = get_available_openai_models()
assert models == ["gpt-4"]
mock_client.return_value.models.list.assert_called_once()
def test_get_available_openai_models_failure():
"""Test graceful handling of model retrieval failure."""
with mock.patch("ra_aid.llm.OpenAI") as mock_client:
mock_client.return_value.models.list.side_effect = Exception("API Error")
models = get_available_openai_models()
assert models == []
mock_client.return_value.models.list.assert_called_once()
def test_select_expert_model_explicit():
"""Test model selection with explicitly specified model."""
model = select_expert_model("openai", "gpt-4")
assert model == "gpt-4"
def test_select_expert_model_non_openai():
"""Test model selection for non-OpenAI provider."""
model = select_expert_model("anthropic", None)
assert model is None
def test_select_expert_model_priority():
"""Test model selection follows priority order."""
available_models = ["gpt-4", "o1", "o3-mini"]
with mock.patch(
"ra_aid.llm.get_available_openai_models", return_value=available_models
):
model = select_expert_model("openai")
assert model == "o3-mini"
def test_select_expert_model_no_match():
"""Test model selection when no priority models available."""
available_models = ["gpt-4", "gpt-3.5"]
with mock.patch(
"ra_aid.llm.get_available_openai_models", return_value=available_models
):
model = select_expert_model("openai")
assert model is None
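The priority-order selection these tests exercise amounts to a first-match scan over a preference list. A hedged sketch — the priority order is an assumption inferred from the test expectations, and the function name is suffixed to mark it as illustrative:

```python
from typing import List, Optional

# Assumed priority order, inferred from the tests: o3-mini is preferred.
EXPERT_MODEL_PRIORITY = ["o3-mini", "o1", "o1-preview"]


def select_expert_model_sketch(available_models: List[str]) -> Optional[str]:
    """Return the first priority model the account actually has access to."""
    for candidate in EXPERT_MODEL_PRIORITY:
        if candidate in available_models:
            return candidate
    return None
```

Returning `None` when no priority model is available lets the caller fall back to a provider default rather than failing outright.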
def test_temperature_validation(clean_env, mock_openai):
"""Test temperature validation in command line arguments."""
from ra_aid.__main__ import parse_arguments
@ -292,14 +421,12 @@ def test_provider_name_validation():
for provider in providers:
try:
with patch("ra_aid.llm.ChatOpenAI"), patch("ra_aid.llm.ChatAnthropic"):
initialize_llm(provider, "test-model")
except ValueError:
pytest.fail(f"Valid provider {provider} raised ValueError")
# Test case sensitivity
with patch("ra_aid.llm.ChatOpenAI"):
with pytest.raises(ValueError):
initialize_llm("OpenAI", "test-model")
initialize_llm(provider, "test-model", temperature=0.7)
except ValueError as e:
if "Temperature must be provided" not in str(e):
pytest.fail(
f"Valid provider {provider} raised unexpected ValueError: {e}"
)
def test_initialize_llm_cross_provider(
@ -308,23 +435,31 @@ def test_initialize_llm_cross_provider(
"""Test initializing different providers in sequence."""
# Initialize OpenAI
monkeypatch.setenv("OPENAI_API_KEY", "openai-key")
_llm1 = initialize_llm("openai", "gpt-4")
_llm1 = initialize_llm("openai", "gpt-4", temperature=0.7)
mock_openai.assert_called_with(
api_key="openai-key", model="gpt-4", temperature=0.7, timeout=180, max_retries=5
)
# Initialize Anthropic
monkeypatch.setenv("ANTHROPIC_API_KEY", "anthropic-key")
_llm2 = initialize_llm("anthropic", "claude-3")
_llm2 = initialize_llm("anthropic", "claude-3", temperature=0.7)
mock_anthropic.assert_called_with(
api_key="anthropic-key",
model_name="claude-3",
temperature=0.7,
timeout=180,
max_retries=5,
)
# Initialize Gemini
monkeypatch.setenv("GEMINI_API_KEY", "gemini-key")
_llm3 = initialize_llm("gemini", "gemini-2.0-flash-thinking-exp-1219")
# Verify both were initialized correctly
mock_openai.assert_called_once_with(api_key="openai-key", model="gpt-4")
mock_anthropic.assert_called_once_with(
api_key="anthropic-key", model_name="claude-3"
)
mock_gemini.assert_called_once_with(
api_key="gemini-key", model="gemini-2.0-flash-thinking-exp-1219"
_llm3 = initialize_llm("gemini", "gemini-pro", temperature=0.7)
mock_gemini.assert_called_with(
api_key="gemini-key",
model="gemini-pro",
temperature=0.7,
timeout=180,
max_retries=5,
)
@ -359,7 +494,11 @@ def test_environment_variable_precedence(clean_env, mock_openai, monkeypatch):
# Test LLM client creation with expert mode
_llm = create_llm_client("openai", "o1", is_expert=True)
mock_openai.assert_called_with(
api_key="expert-key", model="o1", reasoning_effort="high"
api_key="expert-key",
model="o1",
reasoning_effort="high",
timeout=180,
max_retries=5,
)
# Test environment validation
@ -414,46 +553,25 @@ def test_initialize_deepseek(
monkeypatch.setenv("DEEPSEEK_API_KEY", "test-key")
# Test with reasoner model
_model = initialize_llm("deepseek", "deepseek-reasoner")
_model = initialize_llm("deepseek", "deepseek-reasoner", temperature=0.7)
mock_deepseek_reasoner.assert_called_with(
api_key="test-key",
base_url="https://api.deepseek.com",
temperature=1,
model="deepseek-reasoner",
temperature=0.7,
timeout=180,
max_retries=5,
)
# Test with non-reasoner model
_model = initialize_llm("deepseek", "deepseek-chat")
# Test with OpenAI-compatible model
_model = initialize_llm("deepseek", "deepseek-chat", temperature=0.7)
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://api.deepseek.com",
temperature=1,
model="deepseek-chat",
)
def test_initialize_expert_deepseek(
clean_env, mock_openai, mock_deepseek_reasoner, monkeypatch
):
"""Test expert DeepSeek provider initialization."""
monkeypatch.setenv("EXPERT_DEEPSEEK_API_KEY", "test-key")
# Test with reasoner model
_model = initialize_expert_llm("deepseek", "deepseek-reasoner")
mock_deepseek_reasoner.assert_called_with(
api_key="test-key",
base_url="https://api.deepseek.com",
temperature=0,
model="deepseek-reasoner",
)
# Test with non-reasoner model
_model = initialize_expert_llm("deepseek", "deepseek-chat")
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://api.deepseek.com",
temperature=0,
base_url="https://api.deepseek.com", # Updated to match implementation
model="deepseek-chat",
temperature=0.7,
timeout=180,
max_retries=5,
)
@ -464,69 +582,12 @@ def test_initialize_openrouter_deepseek(
monkeypatch.setenv("OPENROUTER_API_KEY", "test-key")
# Test with DeepSeek R1 model
_model = initialize_llm("openrouter", "deepseek/deepseek-r1")
_model = initialize_llm("openrouter", "deepseek/deepseek-r1", temperature=0.7)
mock_deepseek_reasoner.assert_called_with(
api_key="test-key",
base_url="https://openrouter.ai/api/v1",
temperature=1,
model="deepseek/deepseek-r1",
)
# Test with non-DeepSeek model
_model = initialize_llm("openrouter", "mistral/mistral-large")
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://openrouter.ai/api/v1",
model="mistral/mistral-large",
)
def test_initialize_expert_openrouter_deepseek(
clean_env, mock_openai, mock_deepseek_reasoner, monkeypatch
):
"""Test expert OpenRouter DeepSeek model initialization."""
monkeypatch.setenv("EXPERT_OPENROUTER_API_KEY", "test-key")
# Test with DeepSeek R1 model via create_llm_client
_model = create_llm_client("openrouter", "deepseek/deepseek-r1", is_expert=True)
mock_deepseek_reasoner.assert_called_with(
api_key="test-key",
base_url="https://openrouter.ai/api/v1",
temperature=0,
model="deepseek/deepseek-r1",
)
# Test with non-DeepSeek model
_model = create_llm_client("openrouter", "mistral/mistral-large", is_expert=True)
mock_openai.assert_called_with(
api_key="test-key",
base_url="https://openrouter.ai/api/v1",
model="mistral/mistral-large",
temperature=0,
)
def test_deepseek_environment_fallback(clean_env, mock_deepseek_reasoner, monkeypatch):
"""Test DeepSeek environment variable fallback behavior."""
# Test environment variable helper with fallback
monkeypatch.setenv("DEEPSEEK_API_KEY", "base-key")
assert get_env_var("DEEPSEEK_API_KEY", expert=True) == "base-key"
# Test provider config with fallback
config = get_provider_config("deepseek", is_expert=True)
assert config["api_key"] == "base-key"
assert config["base_url"] == "https://api.deepseek.com"
# Test with expert key
monkeypatch.setenv("EXPERT_DEEPSEEK_API_KEY", "expert-key")
config = get_provider_config("deepseek", is_expert=True)
assert config["api_key"] == "expert-key"
# Test client creation with expert key
_model = create_llm_client("deepseek", "deepseek-reasoner", is_expert=True)
mock_deepseek_reasoner.assert_called_with(
api_key="expert-key",
base_url="https://api.deepseek.com",
temperature=0,
model="deepseek-reasoner",
timeout=180,
max_retries=5,
)
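The fallback behavior this test exercises can be sketched as follows; this `get_env_var` is a hypothetical reimplementation consistent with the assertions, not RA.Aid's actual code:

```python
import os


def get_env_var(name: str, expert: bool = False):
    """Look up an environment variable; when expert=True, prefer the
    EXPERT_-prefixed variant and fall back to the base name."""
    if expert:
        value = os.environ.get(f"EXPERT_{name}")
        if value is not None:
            return value
    return os.environ.get(name)


# With only the base key set, the expert lookup falls back to it
os.environ["DEEPSEEK_API_KEY"] = "base-key"
os.environ.pop("EXPERT_DEEPSEEK_API_KEY", None)
assert get_env_var("DEEPSEEK_API_KEY", expert=True) == "base-key"

# Once the expert key exists it takes precedence
os.environ["EXPERT_DEEPSEEK_API_KEY"] = "expert-key"
assert get_env_var("DEEPSEEK_API_KEY", expert=True) == "expert-key"
```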


@ -1,6 +1,10 @@
import pytest
from ra_aid.tools.programmer import (
get_aider_executable,
parse_aider_flags,
run_programming_task,
)
# Test cases for parse_aider_flags function
test_cases = [
@ -33,6 +37,37 @@ test_cases = [
),
("--yes-always", ["--yes-always"], "single flag with dashes"),
("yes-always", ["--yes-always"], "single flag without dashes"),
# New test cases for flags with values
(
"--analytics-log filename.json",
["--analytics-log", "filename.json"],
"flag with value",
),
(
"--analytics-log filename.json, --model gpt4",
["--analytics-log", "filename.json", "--model", "gpt4"],
"multiple flags with values",
),
(
"--dark-mode, --analytics-log filename.json",
["--dark-mode", "--analytics-log", "filename.json"],
"mix of simple flags and flags with values",
),
(
" --dark-mode , --model gpt4 ",
["--dark-mode", "--model", "gpt4"],
"flags with values and extra whitespace",
),
(
"--analytics-log    filename.json",
["--analytics-log", "filename.json"],
"multiple spaces between flag and value",
),
(
"---dark-mode,----model gpt4",
["--dark-mode", "--model", "gpt4"],
"stripping extra dashes",
),
]
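The table above fully pins down the observable behavior of `parse_aider_flags`. A minimal sketch that satisfies every case (not necessarily the shipped implementation):

```python
def parse_aider_flags(flags_str: str) -> list[str]:
    """Parse a comma-separated aider flag string into an argv-style list.

    Each comma-separated chunk is split on whitespace; the first token is
    normalized to exactly two leading dashes and any remaining tokens are
    passed through unchanged as the flag's values.
    """
    args: list[str] = []
    for chunk in flags_str.split(","):
        tokens = chunk.strip().split()
        if not tokens:
            continue
        args.append("--" + tokens[0].lstrip("-"))
        args.extend(tokens[1:])
    return args


assert parse_aider_flags("yes-always") == ["--yes-always"]
assert parse_aider_flags("---dark-mode,----model gpt4") == ["--dark-mode", "--model", "gpt4"]
assert parse_aider_flags(" --dark-mode , --model gpt4 ") == ["--dark-mode", "--model", "gpt4"]
```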
@ -56,9 +91,103 @@ def test_aider_config_flag(mocker):
"ra_aid.tools.programmer.run_interactive_command", return_value=(b"", 0)
)
run_programming_task.invoke({"instructions": "test instruction"})
args = mock_run.call_args[0][0] # Get the first positional arg (command list)
assert "--config" in args
config_index = args.index("--config")
assert args[config_index + 1] == "/path/to/config.yml"
def test_path_normalization_and_deduplication(mocker, tmp_path):
"""Test path normalization and deduplication in run_programming_task."""
# Create a temporary test file
test_file = tmp_path / "test.py"
test_file.write_text("")
new_file = tmp_path / "new.py"
# Mock dependencies
mocker.patch("ra_aid.tools.programmer._global_memory", {"related_files": {}})
mocker.patch(
"ra_aid.tools.programmer.get_aider_executable", return_value="/path/to/aider"
)
mock_run = mocker.patch(
"ra_aid.tools.programmer.run_interactive_command", return_value=(b"", 0)
)
# Test duplicate paths
run_programming_task.invoke(
{
"instructions": "test instruction",
"files": [str(test_file), str(test_file)], # Same path twice
}
)
# Get the command list passed to run_interactive_command
cmd_args = mock_run.call_args[0][0]
# Count occurrences of test_file path in command
test_file_count = sum(1 for arg in cmd_args if arg == str(test_file))
assert test_file_count == 1, "Expected exactly one instance of test_file path"
# Test mixed paths
run_programming_task.invoke(
{
"instructions": "test instruction",
"files": [str(test_file), str(new_file)], # Two different paths
}
)
# Get the command list from the second call
cmd_args = mock_run.call_args[0][0]
# Verify both paths are present exactly once
assert (
sum(1 for arg in cmd_args if arg == str(test_file)) == 1
), "Expected one instance of test_file"
assert (
sum(1 for arg in cmd_args if arg == str(new_file)) == 1
), "Expected one instance of new_file"
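The deduplication this test checks amounts to collapsing file arguments by normalized absolute path. A sketch of that step in isolation (`dedupe_paths` is a hypothetical name for what `run_programming_task` does internally):

```python
import os


def dedupe_paths(paths):
    """Collapse duplicate file arguments by normalized absolute path,
    preserving first-seen order, so aider receives each file only once."""
    seen = set()
    unique = []
    for p in paths:
        norm = os.path.abspath(os.path.normpath(p))
        if norm not in seen:
            seen.add(norm)
            unique.append(p)
    return unique


assert dedupe_paths(["a.py", "./a.py", "b.py"]) == ["a.py", "b.py"]
```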
def test_get_aider_executable(mocker):
"""Test the get_aider_executable function."""
mock_sys = mocker.patch("ra_aid.tools.programmer.sys")
mock_path = mocker.patch("ra_aid.tools.programmer.Path")
mock_os = mocker.patch("ra_aid.tools.programmer.os")
# Mock sys.executable and platform
mock_sys.executable = "/path/to/venv/bin/python"
mock_sys.platform = "linux"
# Mock Path().parent and exists()
mock_path_instance = mocker.MagicMock()
mock_path.return_value = mock_path_instance
mock_parent = mocker.MagicMock()
mock_path_instance.parent = mock_parent
mock_aider = mocker.MagicMock()
mock_parent.__truediv__.return_value = mock_aider
mock_aider.exists.return_value = True
# Mock os.access to return True
mock_os.access.return_value = True
mock_os.X_OK = 1 # Mock the execute permission constant
# Test happy path on Linux
aider_path = get_aider_executable()
assert aider_path == str(mock_aider)
mock_parent.__truediv__.assert_called_with("aider")
# Test Windows path
mock_sys.platform = "win32"
aider_path = get_aider_executable()
mock_parent.__truediv__.assert_called_with("aider.exe")
# Test executable not found
mock_aider.exists.return_value = False
with pytest.raises(RuntimeError, match="Could not find aider executable"):
get_aider_executable()
# Test not executable
mock_aider.exists.return_value = True
mock_os.access.return_value = False
with pytest.raises(RuntimeError, match="is not executable"):
get_aider_executable()
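For reference, a sketch of `get_aider_executable` consistent with the mocked behavior above: locate aider next to the running interpreter, account for Windows, and fail loudly. This mirrors the test's expectations rather than quoting programmer.py:

```python
import os
import sys
from pathlib import Path


def get_aider_executable() -> str:
    """Locate the aider executable installed alongside the current
    Python interpreter (i.e. in the same virtualenv bin/Scripts dir)."""
    bin_dir = Path(sys.executable).parent
    name = "aider.exe" if sys.platform == "win32" else "aider"
    aider = bin_dir / name
    if not aider.exists():
        raise RuntimeError(f"Could not find aider executable at {aider}")
    if not os.access(aider, os.X_OK):
        raise RuntimeError(f"{aider} is not executable")
    return str(aider)
```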


@ -464,6 +464,35 @@ def test_related_files_formatting(reset_memory, tmp_path):
assert get_memory_value("related_files") == ""
def test_emit_related_files_path_normalization(reset_memory, tmp_path):
"""Test that emit_related_files normalizes paths so duplicates are detected"""
# Create a test file
test_file = tmp_path / "file.txt"
test_file.write_text("test content")
# Change to the temp directory so relative paths work
import os
original_dir = os.getcwd()
os.chdir(tmp_path)
try:
# Add file with a plain relative path
result1 = emit_related_files.invoke({"files": ["file.txt"]})
assert "File ID #0:" in result1
# Add same file with relative path - should get same ID due to path normalization
result2 = emit_related_files.invoke({"files": ["./file.txt"]})
assert "File ID #0:" in result2 # Should reuse ID since it's the same file
# Verify only one normalized path entry exists
assert len(_global_memory["related_files"]) == 1
assert os.path.abspath("file.txt") in _global_memory["related_files"].values()
finally:
# Restore original directory
os.chdir(original_dir)
def test_key_snippets_integration(reset_memory, tmp_path):
"""Integration test for key snippets functionality"""
# Create test files


@ -3,7 +3,7 @@ from unittest.mock import patch
import pytest
from ra_aid.tools.write_file import put_complete_file_contents
@pytest.fixture
@ -19,7 +19,9 @@ def test_basic_write_functionality(temp_test_dir):
test_file = temp_test_dir / "test.txt"
content = "Hello, World!\nTest content"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": content}
)
# Verify file contents
assert test_file.read_text() == content
@ -29,7 +31,8 @@ def test_basic_write_functionality(temp_test_dir):
assert result["success"] is True
assert result["filepath"] == str(test_file)
assert result["bytes_written"] == len(content.encode("utf-8"))
assert "Successfully wrote" in result["message"]
assert "bytes" in result["message"]
def test_directory_creation(temp_test_dir):
@ -38,7 +41,9 @@ def test_directory_creation(temp_test_dir):
test_file = nested_dir / "test.txt"
content = "Test content"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": content}
)
assert test_file.exists()
assert test_file.read_text() == content
@ -51,15 +56,23 @@ def test_different_encodings(temp_test_dir):
content = "Hello 世界" # Mixed ASCII and Unicode
# Test UTF-8
result_utf8 = put_complete_file_contents(
{
"filepath": str(test_file),
"complete_file_contents": content,
"encoding": "utf-8",
}
)
assert result_utf8["success"] is True
assert test_file.read_text(encoding="utf-8") == content
# Test UTF-16
result_utf16 = put_complete_file_contents(
{
"filepath": str(test_file),
"complete_file_contents": content,
"encoding": "utf-16",
}
)
assert result_utf16["success"] is True
assert test_file.read_text(encoding="utf-16") == content
@ -71,8 +84,8 @@ def test_permission_error(mock_open_func, temp_test_dir):
mock_open_func.side_effect = PermissionError("Permission denied")
test_file = temp_test_dir / "noperm.txt"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": "test content"}
)
assert result["success"] is False
@ -86,8 +99,8 @@ def test_io_error(mock_open_func, temp_test_dir):
mock_open_func.side_effect = IOError("IO Error occurred")
test_file = temp_test_dir / "ioerror.txt"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": "test content"}
)
assert result["success"] is False
@ -99,12 +112,26 @@ def test_empty_content(temp_test_dir):
"""Test writing empty content to a file."""
test_file = temp_test_dir / "empty.txt"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": ""}
)
assert test_file.exists()
assert test_file.read_text() == ""
assert result["success"] is True
assert result["bytes_written"] == 0
assert "initialized empty file" in result["message"].lower()
def test_write_empty_file_default(temp_test_dir):
"""Test creating an empty file using default parameter."""
test_file = temp_test_dir / "empty_default.txt"
result = put_complete_file_contents({"filepath": str(test_file)})
assert test_file.exists()
assert test_file.read_text() == ""
assert result["success"] is True
assert result["bytes_written"] == 0
assert "initialized empty file" in result["message"].lower()
def test_overwrite_existing_file(temp_test_dir):
@ -116,8 +143,8 @@ def test_overwrite_existing_file(temp_test_dir):
# Overwrite with new content
new_content = "New content"
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": new_content}
)
assert test_file.read_text() == new_content
@ -130,7 +157,9 @@ def test_large_file_write(temp_test_dir):
test_file = temp_test_dir / "large.txt"
content = "Large content\n" * 1000 # Create substantial content
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": content}
)
assert test_file.exists()
assert test_file.read_text() == content
@ -143,8 +172,8 @@ def test_invalid_path_characters(temp_test_dir):
"""Test handling of invalid path characters."""
invalid_path = temp_test_dir / "invalid\0file.txt"
result = put_complete_file_contents(
{"filepath": str(invalid_path), "complete_file_contents": "test content"}
)
assert result["success"] is False
@ -161,8 +190,8 @@ def test_write_to_readonly_directory(temp_test_dir):
os.chmod(readonly_dir, 0o444)
try:
result = put_complete_file_contents(
{"filepath": str(test_file), "complete_file_contents": "test content"}
)
assert result["success"] is False
assert "Permission" in result["message"]
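Collectively these assertions pin down the result contract of `put_complete_file_contents`. A minimal sketch consistent with them (the real tool is registered as a LangChain tool and handles more cases):

```python
import os


def put_complete_file_contents(params: dict) -> dict:
    """Write complete_file_contents (default: empty) to filepath,
    creating parent directories, and report the outcome."""
    filepath = params["filepath"]
    content = params.get("complete_file_contents", "")
    encoding = params.get("encoding", "utf-8")
    try:
        os.makedirs(os.path.dirname(filepath) or ".", exist_ok=True)
        data = content.encode(encoding)
        with open(filepath, "wb") as f:
            f.write(data)
        if data:
            message = f"Successfully wrote {len(data)} bytes to {filepath}"
        else:
            message = f"Initialized empty file at {filepath}"
        return {
            "success": True,
            "filepath": filepath,
            "bytes_written": len(data),
            "message": message,
        }
    except (OSError, ValueError) as exc:
        # PermissionError/IOError are OSError subclasses; null bytes in
        # the path raise ValueError.
        return {
            "success": False,
            "filepath": filepath,
            "bytes_written": 0,
            "message": str(exc),
        }
```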


@ -56,6 +56,102 @@ def sample_git_repo(empty_git_repo):
return empty_git_repo
@pytest.fixture
def git_repo_with_untracked(sample_git_repo):
"""Create a git repository with both tracked and untracked files."""
# Create untracked files
untracked_files = ["untracked.txt", "src/untracked.py", "docs/draft.md"]
for file_path in untracked_files:
full_path = sample_git_repo / file_path
full_path.parent.mkdir(parents=True, exist_ok=True)
full_path.write_text(f"Untracked content of {file_path}")
return sample_git_repo
@pytest.fixture
def git_repo_with_ignores(git_repo_with_untracked):
"""Create a git repository with .gitignore rules."""
# Create .gitignore file
gitignore_content = """
# Python
__pycache__/
*.pyc
# Project specific
*.log
temp/
ignored.txt
docs/draft.md
"""
gitignore_path = git_repo_with_untracked / ".gitignore"
gitignore_path.write_text(gitignore_content)
# Add and commit .gitignore first
subprocess.run(["git", "add", ".gitignore"], cwd=git_repo_with_untracked)
subprocess.run(
["git", "commit", "-m", "Add .gitignore"],
cwd=git_repo_with_untracked,
env={
"GIT_AUTHOR_NAME": "Test",
"GIT_AUTHOR_EMAIL": "test@example.com",
"GIT_COMMITTER_NAME": "Test",
"GIT_COMMITTER_EMAIL": "test@example.com",
},
)
# Create some ignored files
ignored_files = [
"ignored.txt",
"temp/temp.txt",
"src/__pycache__/main.cpython-39.pyc",
]
for file_path in ignored_files:
full_path = git_repo_with_untracked / file_path
full_path.parent.mkdir(parents=True, exist_ok=True)
full_path.write_text(f"Ignored content of {file_path}")
return git_repo_with_untracked
@pytest.fixture
def git_repo_with_aider_files(sample_git_repo):
"""Create a git repository with .aider files that should be ignored."""
# Create .aider files
aider_files = [
".aider.chat.history.md",
".aider.input.history",
".aider.tags.cache.v3/some_file",
"src/.aider.local.settings",
]
# Create regular files
regular_files = ["main.cpp", "src/helper.cpp"]
# Create all files
for file_path in aider_files + regular_files:
full_path = sample_git_repo / file_path
full_path.parent.mkdir(parents=True, exist_ok=True)
full_path.write_text(f"Content of {file_path}")
# Add all files (both .aider and regular) to git
subprocess.run(["git", "add", "."], cwd=sample_git_repo)
subprocess.run(
["git", "commit", "-m", "Add files including .aider"],
cwd=sample_git_repo,
env={
"GIT_AUTHOR_NAME": "Test",
"GIT_AUTHOR_EMAIL": "test@example.com",
"GIT_COMMITTER_NAME": "Test",
"GIT_COMMITTER_EMAIL": "test@example.com",
},
)
return sample_git_repo
def test_is_git_repo(sample_git_repo, tmp_path_factory):
"""Test git repository detection."""
# Create a new directory that is not a git repository
@ -248,39 +344,200 @@ def mock_is_git_repo():
yield mock
@pytest.fixture
def mock_os_path(monkeypatch):
"""Mock os.path functions."""
def mock_exists(path):
return True
def mock_isdir(path):
return True
monkeypatch.setattr(os.path, "exists", mock_exists)
monkeypatch.setattr(os.path, "isdir", mock_isdir)
return monkeypatch
@pytest.mark.parametrize("test_case", FILE_LISTING_TEST_CASES, ids=lambda x: x["name"])
def test_get_file_listing(test_case, mock_subprocess, mock_is_git_repo, mock_os_path):
"""Test get_file_listing with various inputs."""
mock_subprocess.return_value = create_mock_process(test_case["git_output"])
files, total = get_file_listing(DUMMY_PATH, limit=test_case["limit"])
assert files == test_case["expected_files"]
assert total == test_case["expected_total"]
def test_get_file_listing_non_git_repo(mock_is_git_repo, mock_os_path):
"""Test get_file_listing with non-git repository."""
mock_is_git_repo.return_value = False
files, total = get_file_listing(DUMMY_PATH)
assert files == []
assert total == 0
def test_get_file_listing_git_error(mock_subprocess, mock_is_git_repo, mock_os_path):
"""Test get_file_listing when git command fails."""
mock_subprocess.side_effect = GitCommandError("Git command failed")
with pytest.raises(GitCommandError):
get_file_listing(DUMMY_PATH)
def test_get_file_listing_permission_error(
mock_subprocess, mock_is_git_repo, mock_os_path
):
"""Test get_file_listing with permission error."""
mock_subprocess.side_effect = PermissionError("Permission denied")
with pytest.raises(DirectoryAccessError):
get_file_listing(DUMMY_PATH)
def test_get_file_listing_unexpected_error(
mock_subprocess, mock_is_git_repo, mock_os_path
):
"""Test get_file_listing with unexpected error."""
mock_subprocess.side_effect = Exception("Unexpected error")
with pytest.raises(FileListerError):
get_file_listing(DUMMY_PATH)
def test_get_file_listing_with_untracked(git_repo_with_untracked):
"""Test that file listing includes both tracked and untracked files."""
files, count = get_file_listing(str(git_repo_with_untracked))
# Check tracked files are present
assert "README.md" in files
assert "src/main.py" in files
# Check untracked files are present
assert "untracked.txt" in files
assert "src/untracked.py" in files
# Verify count includes both tracked and untracked
expected_count = 8 # 5 tracked + 3 untracked
assert count == expected_count
def test_get_file_listing_with_untracked_and_limit(git_repo_with_untracked):
"""Test that file listing with limit works correctly with untracked files."""
limit = 3
files, count = get_file_listing(str(git_repo_with_untracked), limit=limit)
# Total count should still be full count
assert count == 8 # 5 tracked + 3 untracked
# Only limit number of files should be returned
assert len(files) == limit
# Files should be sorted, so we can check first 3
assert files == sorted(files)
def test_get_file_listing_respects_gitignore(git_repo_with_ignores):
"""Test that file listing respects .gitignore rules."""
# First test with hidden files excluded (default)
files, count = get_file_listing(str(git_repo_with_ignores))
# These files should be included (tracked or untracked but not ignored)
assert "README.md" in files
assert "src/main.py" in files
assert "untracked.txt" in files
assert "src/untracked.py" in files
# .gitignore should be excluded as it's hidden
assert ".gitignore" not in files
# These files should be excluded (ignored)
assert "ignored.txt" not in files
assert "temp/temp.txt" not in files
assert "src/__pycache__/main.cpython-39.pyc" not in files
assert "docs/draft.md" not in files # Explicitly ignored in .gitignore
# Count should include non-ignored, non-hidden files
expected_count = 7 # 5 tracked + 2 untracked (excluding .gitignore)
assert count == expected_count
# Now test with hidden files included
files, count = get_file_listing(str(git_repo_with_ignores), include_hidden=True)
# .gitignore should now be included
assert ".gitignore" in files
# Count should include non-ignored files plus .gitignore
expected_count = 8 # 5 tracked + 2 untracked + .gitignore
assert count == expected_count
def test_aider_files_excluded(git_repo_with_aider_files):
"""Test that .aider files are excluded from the file listing."""
files, count = get_file_listing(str(git_repo_with_aider_files))
# Regular files should be included
assert "main.cpp" in files
assert "src/helper.cpp" in files
# .aider files should be excluded
assert ".aider.chat.history.md" not in files
assert ".aider.input.history" not in files
assert ".aider.tags.cache.v3/some_file" not in files
assert "src/.aider.local.settings" not in files
# Only the regular files should be counted
expected_count = 7 # 5 original files from sample_git_repo + 2 new regular files
assert count == expected_count
assert len(files) == expected_count
def test_hidden_files_excluded_by_default(git_repo_with_aider_files):
"""Test that hidden files are excluded by default."""
# Create some hidden files
hidden_files = [".config", ".env", "src/.local", ".gitattributes"]
# Create regular files
regular_files = ["main.cpp", "src/helper.cpp"]
# Create all files
for file_path in hidden_files + regular_files:
full_path = git_repo_with_aider_files / file_path
full_path.parent.mkdir(parents=True, exist_ok=True)
full_path.write_text(f"Content of {file_path}")
# Add all files to git
subprocess.run(["git", "add", "."], cwd=git_repo_with_aider_files)
subprocess.run(
["git", "commit", "-m", "Add files including hidden files"],
cwd=git_repo_with_aider_files,
env={
"GIT_AUTHOR_NAME": "Test",
"GIT_AUTHOR_EMAIL": "test@example.com",
"GIT_COMMITTER_NAME": "Test",
"GIT_COMMITTER_EMAIL": "test@example.com",
},
)
# Test default behavior (hidden files excluded)
files, count = get_file_listing(str(git_repo_with_aider_files))
# Regular files should be included
assert "main.cpp" in files
assert "src/helper.cpp" in files
# Hidden files should be excluded
for hidden_file in hidden_files:
assert hidden_file not in files
# Only regular files should be counted
assert count == 7 # 5 original files + 2 new regular files
# Test with include_hidden=True
files, count = get_file_listing(str(git_repo_with_aider_files), include_hidden=True)
# Both regular and hidden files should be included
assert "main.cpp" in files
assert "src/helper.cpp" in files
for hidden_file in hidden_files:
assert hidden_file in files
# All files should be counted
assert count == 11 # 5 original + 2 regular + 4 hidden
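The exclusion rules exercised across these tests reduce to a per-path predicate. A sketch of such a filter (`should_include` is a hypothetical name; the real `get_file_listing` may implement this differently):

```python
def should_include(path: str, include_hidden: bool = False) -> bool:
    """Decide whether a repo-relative path belongs in the listing.

    .aider files are always dropped; other dotfiles (in any path
    component) are dropped unless include_hidden is set.
    """
    parts = path.split("/")
    if any(part.startswith(".aider") for part in parts):
        return False
    if not include_hidden and any(part.startswith(".") for part in parts):
        return False
    return True


files = [
    "main.cpp", "src/helper.cpp", ".gitignore",
    ".aider.chat.history.md", "src/.aider.local.settings", "src/.local",
]
assert [f for f in files if should_include(f)] == ["main.cpp", "src/helper.cpp"]
assert [f for f in files if should_include(f, include_hidden=True)] == [
    "main.cpp", "src/helper.cpp", ".gitignore", "src/.local",
]
```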