Have you managed to order an iPhone 15 Pro? According to Apple's official website, the earliest delivery date for the iPhone 15 Pro Max has slipped to early November. Although first-batch pre-sale data has not yet been announced, reservations for the iPhone 15 series on JD.com alone have already exceeded 3 million.

Apple Online Store (China)


Sales information on Apple's official website

TF International Securities analyst Ming-Chi Kuo recently wrote that sales of the iPhone 15 Pro Max have once again exceeded those of last year's 14 Pro Max. As for Huawei's impact on Apple's sales, he believes it "still needs to be observed." His estimate for iPhone 15 series shipments this year remains at 80 million units, while he forecasts Huawei's phone sales in 2024 at approximately 60 million units.


During this launch, many people summed up Apple's performance with a familiar line: it never wins the argument online, and it never loses in sales.

Is the iPhone 15 series really strong, good, and innovative, and is it worth the price?

Faced with Huawei's "far ahead" marketing, is Apple really just bumping the model number, as some people say?

Looking back at Apple's autumn "tech gala": twelve Apple executives, including CEO Tim Cook, spent 82 minutes presenting 7 products and 4 chips, while also unveiling a series of long-awaited new technologies.

But what exactly do these new technologies mean for consumers and the industry? How many of them are "far ahead," how many merely "keep pace with Android," and how many are genuine "hard-core innovations"?

By digging into technical details and patents and talking with industry insiders, we try to answer 9 key questions, piece together a more realistic iPhone 15 series, and draw out some more valuable industry observations.


In this in-depth "teardown," we found many details worth thinking about:

1. How will the once-in-a-decade port change and the professional imaging upgrades reshape the creative industry?

2. Why is the titanium middle frame actually "titanium clad aluminum"? How does it work with the screen technology to achieve the narrowest bezels yet?

3. What breakthrough technology is hidden in the "periscope" lens that departs from all previous Android solutions?

4. Why is Apple's thinking on and positioning of telephoto shooting the complete opposite of the Android camp's?

5. What potential plans lie behind Apple's change in chip naming?

6. Will GPU support for ray tracing and MetalFX accelerate the convergence of the two major gaming ecosystems?

7. What room for imagination does deep linkage between the iPhone, Apple Watch and Vision Pro open up?

8. Will 3D spatial video shooting become the next trend in video creation?

9. In the era of large AI models, what has Apple done to pave the way for potential "big AI moves"?

Behind each question there are often detailed upgrades with enough leverage to move an emerging industry, from consumer electronics, film and television creation, and game development to semiconductors and artificial intelligence. If we set aside the online roast session and calm down, we may still discover some of Apple's "disruptive innovations."


The new iPhones officially go on sale this Friday. Whether or not you open your wallet, it's worth being clear about a few things.

01.

USB-C opens up device connections and removes the storage bottleneck, and it may find even more uses in combination with the headset

One of the most discussed new features is the switch to a USB-C port. The last time Apple changed the iPhone's port was in 2012.

The first iPhone, released in 2007, used a 30-pin connector that had to be plugged in the right way up, while Android phones of the same era used Micro-USB. Five years later, Apple switched to the Lightning connector with the iPhone 5 in 2012, and it has been in use for 11 years.

I don’t know how long USB-C can last this time.

After the event, many people shared screenshots of the 243-yuan USB-C to Lightning adapter listed on Apple's official website.


But rather than how Apple makes money from its accessories business, what deserves more attention is obviously what this USB-C port can do.

Does the higher transfer rate really just mean "faster"? In fact, ordinary users rarely use a USB cable to transfer data at all.

When 3G was replaced by 4G, many people did not expect that the food delivery and online ride-hailing industries would be spawned later.

In fact, this USB-C port is a "hard necessity" for many users with high-quality video shooting and storage needs.

With a data transfer rate of 10 Gb/s, this port can connect high-speed external storage devices such as portable SSDs and USB flash drives, as well as 4K monitors and microphones.
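To get a feel for what 10 Gb/s means in practice, here is a rough back-of-the-envelope calculation, comparing it against the Lightning port's USB 2.0 ceiling of 480 Mb/s (real-world throughput of both links is lower due to protocol overhead; the 50 GB file size is just an example):

```python
def transfer_seconds(file_gb: float, link_gbps: float) -> float:
    """Idealized time to move a file over a link, ignoring protocol overhead."""
    file_gigabits = file_gb * 8          # 1 byte = 8 bits
    return file_gigabits / link_gbps

# A hypothetical 50 GB batch of footage:
usb2 = transfer_seconds(50, 0.48)        # Lightning tops out at USB 2.0 (480 Mb/s)
usb3 = transfer_seconds(50, 10.0)        # iPhone 15 Pro's USB-C at 10 Gb/s

print(f"USB 2.0: {usb2 / 60:.0f} min, USB-C 10 Gb/s: {usb3:.0f} s")
```

Roughly fourteen minutes shrinks to well under a minute, which is the difference between "never bother" and "part of the daily workflow."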

The most intuitive change is that this solves the problem of insufficient storage space when shooting ProRes video. As one professional put it, "Being able to save ProRes footage shot on iPhone directly to an external drive in real time is definitely a cool workflow."

A clip from Apple’s official promotional video is a good example of the convenience brought by external storage:


PS: Portable SSDs from "friendly rival" Samsung are much cheaper than Apple's official storage upgrades.

In addition, Apple has raised the frame-rate ceiling for ProRes 4K video from 30fps to 60fps. At the same time, the iPhone can connect to a Mac through the USB-C port, so captured footage can be edited on the Mac in a streamlined workflow.
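The storage math makes the external-drive case concrete. The figures below are approximations, not official iPhone 15 Pro specs: roughly 6 GB per minute is the commonly cited rate for 4K/30fps ProRes on iPhone, and doubling the frame rate roughly doubles the data rate:

```python
# Rough ProRes storage arithmetic (assumed rates, see lead-in above).
GB_PER_MIN_4K30 = 6.0
GB_PER_MIN_4K60 = GB_PER_MIN_4K30 * 2    # ~12 GB per minute at 60fps

def minutes_of_footage(storage_gb: float, gb_per_min: float) -> float:
    """How many minutes of footage fit in a given amount of storage."""
    return storage_gb / gb_per_min

# A 128 GB iPhone with ~100 GB free vs. a 2 TB external SSD:
phone_min = minutes_of_footage(100, GB_PER_MIN_4K60)
ssd_min = minutes_of_footage(2000, GB_PER_MIN_4K60)
print(f"Phone: ~{phone_min:.0f} min, external SSD: ~{ssd_min:.0f} min")
```

At ProRes 4K60 data rates, internal storage holds only minutes of footage, which is exactly why recording straight to an external drive matters.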


It is true that most professional video productions still require professional cameras, but mobile imaging has come far enough that, taking the iPhone 14 Pro's video capabilities as an example, it can already meet the needs of many independent creators and photography enthusiasts.


For users who use iPhone to create videos, the C port has liberated them from some creative limitations to a certain extent.

Some professional creatives even believe that this may be the upgrade that has the greatest impact on them in the iPhone 15.

Of course, Apple's Vision Pro is not excluded from the list of devices that can connect to this port.

At the event, Apple also mentioned shooting spatial videos with the iPhone 15 Pro and playing them directly on Vision Pro. The storage requirements of spatial video are still unclear, but it is foreseeable that if some future Vision Pro feature requires efficient collaboration between phone and headset, involving high-speed, low-latency data transfer, this USB-C upgrade has clearly laid the groundwork.

So on the surface Apple merely changed a port, but at the level of practical applications, it has indeed come up with some new ways of playing.

That said, although the port has been upgraded, the 20W charging speed has not changed, and there is really no defending that.

02.

One middle frame, "four layers": new processes quietly arrive, and Xiaomi, OPPO and vivo will have to follow suit

Anyone who watched the event probably noticed that Apple spent a lot of time introducing the titanium middle frame, even bringing out Yang, an Apple materials science engineer, to explain it.


Everyone knows titanium is an excellent metal, but actually putting it into a phone's middle frame may not be as easy as imagined. The fact is that Apple is the first to use titanium alloy at scale in a phone's middle frame.

How does Apple process it? Judging from the demonstration video, Apple uses a technique called "solid-state diffusion": thermomechanical processing joins the titanium and aluminum under ultra-high pressure, so that at the joint surface the two metals "diffuse" into each other at the atomic level.


This process technology is the first of its kind in the smartphone industry.

So strictly speaking, Apple's titanium middle frame is a "titanium clad aluminum" structure. In Apple's words, this helps with heat conduction; at the same time, the aluminum is recycled material, which also serves Apple's environmental goals.

Why does it help heat conduction? A quick physics refresher: the thermal conductivity of pure titanium is λ = 15.24 W/(m·K), roughly 1/14 that of aluminum, and titanium alloys conduct about 15% worse than pure titanium.
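The numbers explain the sandwich. A minimal sketch using the article's figures, plus a typical handbook value for aluminum (the aluminum figure is my assumption, chosen to match the "1/14" ratio above):

```python
# Thermal-conductivity comparison behind the "titanium clad aluminum" design.
# All values in W/(m·K); treat them as illustrative handbook figures.
k_titanium = 15.24               # pure titanium, as cited in the text
k_ti_alloy = k_titanium * 0.85   # titanium alloy, ~15% lower than pure Ti
k_aluminum = 213.0               # aluminum, roughly 14x pure titanium (assumed)

print(f"Al conducts ~{k_aluminum / k_titanium:.0f}x better than pure Ti,")
print(f"and ~{k_aluminum / k_ti_alloy:.0f}x better than Ti alloy.")
```

An aluminum core conducting an order of magnitude better than the titanium shell is what lets the frame double as a heat spreader.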

So Apple wanted the frame to be strong, light, and good at dissipating heat, and it finally arrived at this "nesting doll" process.

In fact, beyond being "sturdier," this middle frame has two easily overlooked functions: it makes the phone lighter, and it works with the screen packaging technology to achieve narrower bezels.

The iPhone 15 Pro weighs 187 grams this time, versus 206 grams for the previous-generation iPhone 14 Pro. Even after switching to the larger 5x telephoto module, the 6.7-inch iPhone 15 Pro Max is about 20 grams lighter than its predecessor.


Joswiak, Apple's senior vice president of worldwide marketing, said at the event that Apple has shrunk the phone's body while keeping the screen size unchanged. In fact, few people have noticed that the iPhone 15 Pro is 0.9 mm shorter and 0.9 mm narrower.

That is to say, when you pick up the Pro, you will not only feel that it is noticeably lighter, but may also find it slightly smaller and better in the hand.


A simple calculation: if the screen size is unchanged and the body is 0.9 mm narrower, the bezel on each side must be at least 0.45 mm narrower.

As Joswiak said, this is their "lightest Pro" to date, and it's also the iPhone with the narrowest bezels to date.


Bloomberg previously reported that Apple may have used a technology called LIPO (low-injection pressure over-molding) to narrow the bezels, but Apple did not reveal at the event exactly how it was achieved.

It is worth noting that the new process on this titanium middle frame is not only about shaping titanium. Careful observers will notice that, compared with last year's stainless steel frame, this year's titanium frame has a brushed texture.


Apple says it used multiple processes, including precision machining, grinding, wire drawing, and sandblasting, to create this "brush-mark texture." On top of the texture, Apple applied a PVD coating for final protection, with nanometer-level precision; Apple says this coating process takes 14 hours.

The brushed finish may seem like a tiny detail, but it effectively solves the problem of the metal frame picking up fingerprints. Apple really is chipping away at everyday user complaints bit by bit.

It looks like just a titanium middle frame, but viewed this way it has at least the "four layers" mentioned above: PVD coating, brushed texture, titanium, and aluminum.


Behind this, the industry-side process improvements driven by Apple's scale seem likely to reach other smartphone makers' products soon.

According to industry information, domestic flagships will most likely all play the "new materials" card by the end of the year; Xiaomi, for example, is also expected to use titanium alloy in future flagship products.

03.

How exactly do you fit seven lenses into a phone? Apple's imaging embraces "professionalism"

After talking about the two most obvious changes, let’s take a look at the photography part.

At first glance, the three cameras on this year's iPhone 15 Pro and Pro Max look the same as last year's: the module has not grown, and the layout is still an equilateral triangle.


But Apple summed it up at the event: one of the core things it did this time was to fit the equivalent of seven professional lenses into a single phone.


Many people lament that the iPhone's appearance has barely changed in recent years, but from another perspective this is exactly what Apple is good at: packing more new features into a body of almost unchanged, or even slightly reduced, size, while keeping the product experience complete and consistent.

There is a detail Apple did not mention at the event that caught the attention of many in the industry: among Apple's "seven lenses," some focal lengths are digital crops, yet the shooting features are complete and consistent across every focal length. For example, portrait-mode autofocus works at every focal length option.


According to industry insiders, achieving this kind of consistency is very difficult; currently no manufacturer in the Android camp delivers the same experience.


In addition, Apple mentioned that the 48-megapixel main camera has a new trick: Apple's Photonic Engine selects the best pixel detail from an ultra-high-resolution image, then merges it with another image optimized for light capture, automatically generating a 24-megapixel photo with twice the usual resolution.


Android flagship main cameras currently start at around 50 megapixels, with 100 or even 200 megapixels not uncommon, and manufacturers emphasize large sensors and high pixel counts at their launch events. But when you actually shoot with them, the default photos are usually still 12 megapixels.

The "four-in-one" and "nine-in-one" pixel binning we often hear about does exactly this.

For example, a typical photo from a 50-megapixel main camera has a resolution of 4096 × 3072, about 12.58 million pixels; multiplied by 4, that is roughly 50 million.


Detailed information about the main camera of an Android flagship phone

If you are not sure what resolution your phone actually shoots at, open your photo album and check a photo's details for a more intuitive sense.

So Apple's 24 megapixels may not sound "scary," but it is actually plenty sharp. How sharp? A 4K monitor has about 8.29 million pixels; one of Apple's photos could cover about three 4K monitors pixel for pixel.
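The binning and resolution claims above are easy to verify with a few lines of arithmetic:

```python
# Pixel arithmetic behind binning and the 24 MP claim.
binned = 4096 * 3072        # typical 50 MP sensor output after 4-in-1 binning
full_res = binned * 4       # the "50 megapixel" full-resolution readout

photo_24mp = 24_000_000
uhd_4k = 3840 * 2160        # pixel count of a 4K UHD monitor

print(f"Binned: {binned:,} px, full: {full_res:,} px")
print(f"24 MP photo spans {photo_24mp / uhd_4k:.1f} 4K monitors")
```

The binned output is 12,582,912 pixels (about 12.58 MP), four times that is about 50 MP, and a 24 MP photo covers roughly 2.9 of a 4K monitor's 8,294,400 pixels.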

How the fusion of an ultra-high-resolution image and a second, light-optimized image is completed at the moment of capture to produce the final 24-megapixel output is also worth exploring further.

It is worth mentioning that Apple now supports storing 48-megapixel photos in HEIF format, which takes less space at the same clarity. According to public information, the same photo stored as HEIF saves about 50-60% of the space of traditional formats such as JPG/JPEG.


Comparison of storage space occupied by HEIF and JPEG format photos (the default file extension of HEIF format is .HEIF or .HEIC)

When it comes to photography this time, the improvement in "professionalism" is easily overlooked. In fact, Apple keeps pushing the iPhone toward "professional imaging."

For example, Apple has added Log encoding on top of ProRes, which increases the latitude and flexibility for post-production visual effects and color grading.

Color management on Apple devices has long been the "ceiling" of consumer electronics. This time, the iPhone 15 Pro is the first phone to support the Academy Color Encoding System (ACES), a professional color management and interchange system used by many mainstream film production companies.


A professional in film and television production told Zhidongzhi that, in fact, the video capabilities of even the iPhone 14 Pro can meet almost all shooting needs and realize any creative idea; the hardware is already beyond what many ordinary users can exploit.


Combined with the high speed and external-device support brought by USB-C, discussed in the port section above, the iPhone's gains in professional image creation can fairly be called a big move that most ordinary users have overlooked.

Finally, spatial video shooting confirms what many people suspected three months ago.

Linkage between Apple Vision Pro and the iPhone is inevitable, and Vision Pro's linkage with other Apple ecosystem devices such as the Mac, Apple Watch, and AirPods is already an established fact.


The principle behind shooting 3D spatial video is not hard to understand. Just as the world flattens when we cover one eye, one of the most basic requirements of 3D imaging is the participation of two cameras.

When the phone is held horizontally, the main camera and the ultra-wide camera happen to form a "binocular field of view" for shooting 3D spatial video.
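For readers curious how two horizontally separated cameras yield depth, the standard pinhole stereo model relates depth to the shift ("disparity") of a point between the two views. The baseline, focal length, and disparity below are hypothetical illustration values, not Apple's actual camera geometry:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d.

    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centers, in meters
    disparity_px: horizontal shift of a point between the two images, in pixels
    """
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: ~2 cm between the two cameras, a 1500 px focal
# length, and a subject that shifts 15 px between the two images.
z = depth_from_disparity(focal_px=1500, baseline_m=0.02, disparity_px=15)
print(f"Estimated depth: {z:.1f} m")
```

Nearby objects shift more between the two views than distant ones, and that difference in disparity is exactly the depth cue the headset replays to each eye.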


When Vision Pro was announced, many people complained that shooting video while wearing a headset would be "weird." Now the answer has been revealed: you can still shoot 3D spatial video with a phone, the most natural way.

One cannot help wondering whether video creation will also shift from flat to 3D in the future, which seems to be brewing a new storm for the film and television industry.

It can be said that Apple has opened a small door this time, allowing us to see a simple side of the linkage between Vision Pro and iPhone. More room for imagination is left for the official release early next year.

04.

Apple cares more about 120mm than 120x: the telephoto lens hides its secrets in the internal structure

After the main camera, let's look at the new telephoto on the Pro Max, the "periscope lens" mentioned in many reports.

One interesting thing about this lens is Apple’s positioning of it.

Compared with the "100x zoom" wars we often see in the Android camp, with their 100x and 120x figures, what Apple values more is "120 mm."


In Apple's words, it is as if you carry an extra "120mm lens" with you, and Apple believes 120mm is a very practical focal length. This runs completely counter to our previous impression of periscope lenses.

Rather than photographing a window on a tall building 800 meters away, vehicles and pedestrians hundreds of meters down the street, or the full moon in the night sky, Apple believes this telephoto is better suited to "close-ups": cinematic portrait shots, or a child's great goal in a match.


In fact, throughout the entire event, Apple only ever mentioned the 5x optical zoom capability.

Everyone knows a 5x optical lens can produce 100x or 120x images through further digital cropping, but Apple clearly understands the trade-off: rather than a blurry 100x image, what matters is finding the right positioning for this telephoto lens.


Before Apple, periscope lenses were used as "telescopes." After Apple, they may gradually become "a practical high-power telephoto" that gets used day to day.

If you did not read the fine print on the official website carefully, you would hardly discover that this 5x telephoto supports a maximum zoom of 25x.

In addition to the positioning of the lens, in terms of hardware technology, the telephoto lens made by Apple is actually very different from the solution of the Android camp.

There are two main differences: a quadruple-reflection prism, and 3D sensor-shift stabilization.

Judging from Apple's cutaway diagram, there is only one prism inside the telephoto module, with a parallelogram cross-section. Light enters at one end and, after four reflections, lands perpendicular to the CMOS image sensor.


The more reflections, the longer the optical path that fits in a limited space, and hence the longer the achievable focal length. Huawei previously achieved five reflections with a combination of three prisms, enabling 10x optical zoom.

Comparing earlier teardown photos of periscope lenses from Samsung, OPPO, Xiaomi and others, Apple's telephoto structure is quite unusual. Light passes through lens, prism, then sensor, whereas in most Android periscope designs the order is lens, prism, lens group, sensor.


Samsung solution (top) and OPPO solution (bottom). Source: JerryRigEverything, WekiHome

Apple's telephoto has a more compact structure, with no additional lens group between the prism and the sensor. Using a single prism to achieve four reflections has not been seen in any other phone.


At the same time, Apple's telephoto sensor "lies flat" at the bottom of the body, whereas Android-camp sensors are usually mounted vertically. One benefit of the flat placement is that sensor size is no longer constrained by the phone's thickness; indeed, this sensor is larger than the telephoto sensor in the iPhone 14 Pro Max.

Also, with the sensor lying flat, the sensor and the entire stabilization-and-focus module have more room to move, so stabilization works better.

Beyond the special prism and optical path, Apple uses a "3D sensor-shift optical image stabilization and autofocus module" in this telephoto, which can shift along the X, Y, and Z axes to counteract shake.


Apple calls this solution its most advanced yet. The module makes fine adjustments 10,000 times per second, double the rate of the previous generation.

This stabilization shows up in the "floating window" preview when users shoot at high digital zoom. We have seen similar features on Samsung phones before, so it is nothing unfamiliar.


Apple has used sensor-shift stabilization for years, but previously it moved mainly along the horizontal X and Y axes. Upgrading it to "three-dimensional" stabilization and applying it to a 5x telephoto has no precedent in the industry.

In the view of some professional photographers, for telephoto shooting, especially at focal lengths of 120 mm and above, stabilization is one of the most critical factors for image quality. Apple's breakthrough in stabilization is therefore crucial to the final results.

Besides stability, the amount of light reaching the sensor is another key factor in image quality. According to Apple, this 5x telephoto has an f/2.8 aperture, the industry's largest at this focal length, the same as the 3x telephoto on the previous iPhone 14 Pro Max. In other words, Apple increased the equivalent focal length without sacrificing aperture.


Data comes from the official websites of each brand

Judging from a parameter comparison of the periscope telephotos on major brands' current flagships, Apple has indeed pulled it off.

Judging from the available information, the unusual design of Apple's telephoto may have something to do with circumventing Samsung's patents. Korean media previously reported that Apple got stuck on Samsung patent issues during periscope development and had "racked its brains" to work around them, though the final solution was never disclosed.

Now that the other shoe has dropped, Apple's solution is clearly very different in structure from Samsung's, Xiaomi's, OPPO's and others', and it also differs from the related patents that surfaced earlier.


Periscope lens-related patents announced by Apple in August 2023

That is to say, Apple has likely amassed a number of patented, exclusive technologies in this lens. When more technical details emerge, we may understand it more deeply.

05.

Why does the A17 Pro, whose performance gains seem "not enough," have the potential to upend the gaming industry?

Speaking of the A17 Pro chip, its naming may first of all hint at some of Apple's subsequent plans.

For example, will a slightly cut-down A17 Pro appear in the next-generation iPhone as the "A17"? Will same-generation A17 chips under different names show up in products such as the iPad and iMac?

The A17 Pro renaming perhaps implies that, at this point, the rhythm of one chip generation per year may have to change.


The previous-generation A16 had 16 billion transistors; the A17 Pro has 18.7% more. The A17 Pro's CPU and GPU performance improved by 10% and 20% respectively.
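The transistor arithmetic works out as follows, and the result matches the 19 billion figure Apple cited for the A17 Pro:

```python
# A16 transistor count plus the stated 18.7% increase.
a16_transistors = 16_000_000_000
a17_pro = a16_transistors * 1.187

print(f"A17 Pro: ~{a17_pro / 1e9:.0f} billion transistors")
```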

Although we do not know exactly how those transistors are split between CPU and GPU, overall the A17 Pro's performance gain is smaller than many in the industry expected. After all, expectations for TSMC's 3nm process were high.

But judging from the results, by the 3nm stage, the gains from process shrinks are indeed no longer as dramatic as earlier jumps, say from 28nm to 14nm, or from 14nm to 7nm.

The 3nm transistor features in the A17 Pro are down to the width of about 12 silicon atoms.

Perhaps this is one reason Apple made "major changes" to the GPU architecture this time, though Apple did not disclose the specific IPC improvements of the A17 Pro's CPU and GPU.


On the GPU side, Apple has, unusually, emphasized a series of gaming improvements. Ray tracing has drawn particular attention, and the native arrival of AAA console titles made many people quip, "why does your next game console need to be a console?"


Qualcomm and MediaTek both shipped hardware ray tracing before Apple, but the games and applications that actually reach consumers remain limited. This is where the pull of Apple's ecosystem stands out.

One of the biggest advantages of developing for the Apple ecosystem is the huge user base and the ease of development: build once, deploy across Apple's major devices. Being among the first to adopt new technologies and launch on Apple's platform also has positive value for game makers and developers.

Worth mentioning here: the iPhone has begun to natively run AAA single-player titles from consoles and PC, including the familiar Resident Evil and Assassin's Creed series.


Apple's introduction of ray tracing, MetalFX, mesh rendering and more seems to suggest that mobile and console games are converging; with these technologies, the mobile gaming experience can take a qualitative leap in visuals.

At the same time, upscaling technologies such as MetalFX, NVIDIA DLSS, and AMD FSR reduce the load on mobile hardware, balancing high image quality with smoothness.


Play Death Stranding on iPhone
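The leverage of upscalers like MetalFX comes from shading far fewer pixels than the output resolution and reconstructing the rest. A minimal sketch of the arithmetic (the internal render resolution is an assumed example, not a published MetalFX setting):

```python
# How much shading work an upscaler can save: render internally at a lower
# resolution, then reconstruct the full-resolution frame.
native = 3840 * 2160        # 4K output: ~8.29 million pixels
internal = 1920 * 1080      # assumed internal render resolution (1080p)

saving = 1 - internal / native
print(f"Pixels shaded per frame drops by {saving:.0%}")
```

Halving each axis cuts the shaded pixel count to a quarter, which is exactly the kind of headroom a phone GPU needs to run console-class titles smoothly.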

For the gaming industry, Apple's entry is likely to accelerate the convergence of mobile, console, and PC games.

In the future, more well-known game masterpieces may be released simultaneously on PC, consoles and mobile phones. This experience is unprecedented for many players.


Of course, Apple has made plenty of promises. Whether the final results live up to the billing, and whether these technologies translate well to mobile games, we will have to judge by the actual "efficacy" when they arrive.

After all, games have never been considered a "strength" for Apple before.

One more sentence Apple dropped while introducing ray tracing also caught my attention: ray tracing can also deliver a more immersive AR experience. How will ray tracing combine with Apple's headset? The event may have raised more questions than it answered.


06.

What does running AI locally mean? Apple's big AI move may be brewing

In the chip section, beyond the GPU, the improvements to the neural engine are also worth noting.

Put plainly: Apple may still have big moves coming in AI.

One of the most intuitive effects of the neural engine's 35 TOPS of compute in the A17 Pro is that some of Apple's AI features can run locally, with data processed entirely on-device rather than in the cloud.

By contrast, Qualcomm's fourth-generation cockpit chip, the Snapdragon 8295, which carmakers have lately treated like a star, offers only 30 TOPS of AI compute. Apple is somewhat taking an ox cleaver to a chicken.

On-device processing and data privacy are among the issues the industry and consumers care most about as large AI models are deployed.

From on-device photo cutouts to creating personalized voices, Apple's series of AI "infrastructure" moves may be paving the way for more products and experiences to come.

The Information once reported that Apple's self-developed large model "AjaxGPT" may exceed GPT-3.5 in capability, with more than 200 billion parameters.

Obviously, a model of that scale cannot run natively on the iPhone, but will Apple, like Google, optimize a version that can run on mobile devices?
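A quick memory estimate shows why a 200-billion-parameter model cannot run natively on a phone; the precision assumptions below are mine, for illustration only:

```python
def model_size_gb(params: float, bytes_per_param: int) -> float:
    """Approximate weight-only memory footprint of a model, in GB."""
    return params * bytes_per_param / 1e9

ajax_params = 200e9                      # parameter count cited in the report
fp16 = model_size_gb(ajax_params, 2)     # 16-bit weights: ~400 GB
int8 = model_size_gb(ajax_params, 1)     # 8-bit weights: ~200 GB

print(f"fp16: {fp16:.0f} GB, int8: {int8:.0f} GB")
```

Even aggressively quantized, the weights alone dwarf the roughly 8 GB of RAM in a flagship phone, so any on-device variant would have to be a far smaller, specially optimized model.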

In photography, the iPhone's automatic capture and analysis of depth information in a shot, its automatic switch to portrait mode, and the flexible adjustment of depth for different subjects after shooting are all inseparable from the AI compute and algorithms behind them.


In Cinematic mode, AI is likewise indispensable to the real-time, automatic switching of focus and those striking close-ups.


Of course, the super-resolution and frame-interpolation technologies the iPhone applies in games, mentioned above, will most likely lean on AI algorithms as they iterate; NVIDIA's DLSS 3 and 3.5, for example, apply a series of AI techniques to improve frame rate and image quality.


It is fair to say that while many people look forward to Apple's next big AI reveal, Apple's use of AI has in fact already permeated every part of its software and hardware ecosystem.

07.

Conclusion: disruptive innovation does not have to arrive with a storm

Sorting through it all, although Apple's "tech gala" lacks flashy new features on the surface, the detailed changes at the hardware, software, algorithm, and ecosystem levels deserve attention.

Apple still does what it does best: without raising users' learning costs, it polishes every new feature before handing it over, while quietly making technical breakthroughs at the foundations, and the trends and directions of industry change are often brewed in those breakthroughs.

Rather than filling the screen with more cries of "far ahead," it is better to sit down and look at how this company, which lost nearly 600 billion yuan of market value in two days yet still holds the world's largest, makes its products.