First, let's look at the current situation. With the popularity of LLMs, there are now many alternatives to ChatGPT, and it's noticeable that a good share of OpenAI's users have drifted away, which seems to have led OpenAI to relax its risk controls recently. For users in China, though, high walls remain, payment chief among them.
Even so, ChatGPT is still the best choice for Chinese users.
On June 1st, a large batch of API quotas expires, mine included, so I need to find a relatively hassle-free solution now.
Let's go over several of the currently popular large models.
ChatGPT#
ChatGPT's advantages are obvious, and it is still the best choice for now; if you can afford it, it's worth paying for. The web version is more accurate than the API, possibly because the web version uses a longer default system prompt.
GPT-3.5 no longer has a significant edge, though it is indeed faster. GPT-4 still has clear advantages, but it also comes with many restrictions and a high cost.
On the payment side, OpenAI recently launched an iOS app that lets users subscribe to Plus directly through the App Store. For web-version users, this is the smoothest way to pay.
You can buy App Store gift cards with Alipay, redeem them on a US Apple account, and subscribe from there; the subscription is shared with the web version as well.
As for the API, there is currently no good payment solution. Some people report successfully binding a domestic multi-currency card, possibly because risk control has been relaxed, but success is still a long shot and requires a clean IP.
If the shell UI you use supports the web model, you can use the Access Token method directly, as long as your network can log in smoothly (if you can't log in, you can also use the solution provided by fakeopen.com, as long as you have an account).
PS: Recently, many people have reported that previously banned OpenAI accounts were unbanned. If your account was banned, try logging in again.
Azure OpenAI#
If you are using the API, Azure's offering is a good option. The review process used to be very strict, but it seems to have relaxed a lot, probably because users really have drifted to other platforms.
A few days ago I applied again and was approved the next day. Many people whose earlier applications were stuck on requests for additional materials have now been approved directly, so if you have applied, check your email.
Once approved, remember to apply for GPT-4 as well; I am still in the queue.
Currently, many shell UIs still do not support Azure directly, but there are plenty of proxies on GitHub, such as azure-openai-proxy.
Don't have a server? You can run cf-openai-azure-proxy for free on Cloudflare Workers.
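To make the mismatch concrete, here is a minimal sketch (Python with requests; the resource name, deployment name, and keys are placeholders I made up) of how an Azure OpenAI call differs from the official OpenAI endpoint. A proxy like azure-openai-proxy essentially rewrites requests from the first shape into the second, so shell UIs never need to know about Azure deployments.

```python
import requests

# Official OpenAI format: one fixed endpoint, model specified in the body.
openai_resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer sk-..."},  # your OpenAI key
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)

# Azure OpenAI format: per-resource endpoint, deployment name in the URL,
# api-version as a query parameter, and an "api-key" header instead of Bearer.
# "my-resource" and "gpt35" are placeholders for your own resource/deployment.
azure_resp = requests.post(
    "https://my-resource.openai.azure.com/openai/deployments/gpt35"
    "/chat/completions?api-version=2023-05-15",
    headers={"api-key": "YOUR_AZURE_KEY"},
    json={"messages": [{"role": "user", "content": "Hello"}]},
)

print(openai_resp.status_code, azure_resp.status_code)
```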
NewBing#
Based on feedback so far, performance is average. For Chinese queries it mostly ends up searching Zhihu and CSDN, and since it runs a search for nearly every question, it can come across as a bit unintelligent.
No need to apply, use it directly.
Claude#
The latest update supports a 100k-token context window, and the long context really does help; output quality is good too. The API is a bit hard to get access to, but the web version can be used directly, and it also works directly in ChatHub.
No need to apply, use it directly.
It can be considered a viable alternative.
Bard#
From Google; the experience is decent. It can serve as an internet-connected ChatGPT, but unfortunately it does not support Chinese at the moment. For other languages it can basically replace GPT-3.5.
No need to apply, use it directly.
Poe#
Similar to ChatHub, it is an all-in-one aggregator that I introduced before and works as a backup option, though content rendering is a bit slow.
It can be used for free, or you can pay through the App Store to get GPT-4.
iFLYTEK Spark#
The only domestically developed option here with usable online intelligence. It still lags behind ChatGPT, but it is at least serviceable, and it suits situations inside China where using a VPN ("magic") to get over the wall is inconvenient.
It requires an application, but approval is easy to get.
Third-party proxy solutions#
If you still want to use ChatGPT itself but don't want to deal with payment, you'll need to consider a proxy.
Currently, many modified shell UIs offer access based on account credits and similar rules. This is probably the most common approach, but personally I wouldn't choose it.
Another approach is services that stand up a proxy URL and sell API keys. In practice it is not much different from the official API: fees are billed per request token, usually at a markup over official pricing. With the key you register and the corresponding proxy URL, you can plug it into anything that supports the official API, much like running your own proxy. I can accept this kind of solution.
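As a rough sketch of what this looks like in practice (the proxy URL and key below are placeholders, not any real provider's endpoint), you keep the official request format and only swap the base URL; most shell UIs expose this as an "API base/endpoint" setting.

```python
import requests

# Placeholders: substitute the base URL and key issued by your chosen provider.
PROXY_BASE = "https://api.example-proxy.com"  # in place of https://api.openai.com
PROXY_KEY = "sk-your-proxy-key"

resp = requests.post(
    f"{PROXY_BASE}/v1/chat/completions",  # same path and body as the official API
    headers={"Authorization": f"Bearer {PROXY_KEY}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```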
Currently, several service providers I know of are:
- openai-sb [recommended]
- CloseAI (openai-asia)
- api2d
One advantage of this approach is that the endpoints can usually be reached directly from within China, without needing a VPN.
There are also shops that sell OpenAI keys outright, and account sellers fall into the same category, since registering an OpenAI account really isn't easy these days.
For users worried about the data-security risk of routing requests through a proxy, this direct key/account approach is worth considering.
In recent days I also discovered a neat project called Pandora. In short, it works around the various errors you run into when using ChatGPT on the web, so it can be used smoothly. The project offers an official demo, and shared accounts are available too. It is essentially a mirror of the web version, and the experience is quite good.
After my quota expires, I will most likely go with openai-sb (cheap) + Pandora + Azure OpenAI.